Help needed to improve performance: parallel processing ideas


 
# 1  
Old 09-21-2010

Hi,
Please tell me how to add parallel processing to the code below. Thanks in advance.
I have a list of user directories under the root directory; each user has a directory named after his/her username.
I am finding the size of each directory with du -g and checking whether it exceeds a 3 GB limit.
The problem is that it takes around 30 minutes for around 1000 users.

Code:
for i in `ls -l | grep -i <username>`
do
du -g $i | awk '{if ($1 > 3) print $0}' >> size.txt
done
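[A minimal corrected sketch of what the check seems to intend, written as a function so the parent directory is a parameter (hypothetical layout; uses the portable du -k rather than AIX's du -g, and a single du invocation instead of one per user):]

```shell
# Report every directory under $1 whose size exceeds 3 GB.
report_large_dirs() {
    # du -sk: one summary line per directory, sizes in kilobytes (POSIX)
    du -sk -- "$1"/*/ 2>/dev/null |
        awk '$1 > 3145728 { print $2 }'   # 3 GB = 3 * 1024 * 1024 KB
}
```

[Newer GNU du also accepts a --threshold option for this kind of filter, but -k plus awk works everywhere, including AIX.]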

# 2  
Old 09-21-2010
try this
Code:
ls -1 | xargs -n 100 | while read entries
do
    # leave $entries unquoted so the shell splits it back into
    # up to 100 separate arguments for du
    du -sk $entries >> out.txt &
done
wait    # let all the background du jobs finish before reading out.txt
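[An alternative sketch that lets xargs manage the parallelism itself, with a bounded number of workers; note -P is a GNU/BSD extension, not POSIX, and the function name and layout are illustrative:]

```shell
# Size every entry under the directory given as $1, in parallel:
#   -n 50: at most 50 names per du invocation
#   -P 4:  at most 4 du processes running at a time
parallel_du() {
    (cd "$1" && ls -1 | xargs -n 50 -P 4 du -sk)
}
```

[Because xargs only returns when all its workers have exited, there are no stray background jobs to wait for.]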

# 3  
Old 09-21-2010
Quote:
for i in `ls -l | grep -i <username>`
do
du -g $i | awk '{if ($1 > 3) print $0}' >> size.txt
done
The script as posted does not work for several reasons. For example, where does "<username>" come from? What is "ls -l" for? What does the "list of users directories in root directory" look like, what created the file, and where is that file?
Please post the script you actually used.

If these are user home directories (same as those in /etc/passwd) there are much easier ways of finding the totals. I wouldn't expect user home directories to be directly under the root directory so maybe this is not what you are trying to do.
# 4  
Old 09-21-2010
Is there any reason to believe doing the lookups in parallel will be faster? The performance limiter is probably going to be how fast the data can be retrieved from disk anyway.

It'd also probably be faster to just pass the file names to one instance of du instead of running du over and over again.
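[A sketch of that suggestion: batch all the names into as few du invocations as xargs can manage, then filter once, instead of 1000 dus and 1000 awks (3 GB = 3145728 KB; the function name and layout are illustrative):]

```shell
# Print the names of entries under $1 larger than 3 GB, batching
# all names through a minimal number of du invocations.
sizes_over_3g() {
    (cd "$1" && ls -1 | xargs du -sk | awk '$1 > 3145728 { print $2 }')
}
```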
# 5  
Old 09-21-2010
Quote:
It'd also probably be faster to just pass the file names to one instance of du instead of running du over and over again.
I think the maximum number of parameters it can take is 256, hence I restricted it to 100 in my example.
# 6  
Old 09-21-2010
I note that matrixmadhan has used "ls -1" (number one), which makes more sense than the "ls -l" (letter ell) in the original post.
Because the original post contains "du -g", I wonder if this is an IBM AIX machine, i.e. one with a very limited command line length.
# 7  
Old 09-21-2010
Just
Code:
ls -1

should do, and its output can be streamed into loop constructs without any extraction or special parsing.

Independent of the shell and the underlying OS, there is a maximum number of values that can be given as arguments to any command.
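[That per-command limit need not be guessed; on POSIX systems getconf reports it. The value covers the combined bytes of the argument list and environment, and is guaranteed to be at least 4096:]

```shell
# Ask the system for its actual argument-space limit, in bytes.
arg_limit=$(getconf ARG_MAX)
echo "ARG_MAX is $arg_limit bytes"
```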