help to parallelize work on thousands of files


 
# 8  
Old 07-11-2010
Hi.

The Linux xargs has a feature to perform this kind of task:
Code:
       --max-procs=max-procs
       -P max-procs
              Run up to max-procs processes at a time; the default is  1.   If
              max-procs  is 0, xargs will run as many processes as possible at
              a time.

See man xargs for details ... cheers, drl
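As a small sketch of the option drl describes (assumes GNU xargs with `-0`/`-P` support; the scratch directory, filenames, and gzip worker are illustrative, not from the thread):

```shell
# Sketch, assuming GNU xargs; paths and the gzip worker are made up.
demo=/tmp/xargs_demo
rm -rf "$demo" && mkdir -p "$demo"
for i in 1 2 3 4 5 6 7 8; do echo "data $i" > "$demo/file$i.txt"; done

# -0: NUL-delimited names (safe for spaces in filenames)
# -P 4: run up to 4 gzip processes at a time
# -n 1: pass one filename per gzip invocation
printf '%s\0' "$demo"/file*.txt | xargs -0 -P 4 -n 1 gzip

ls "$demo"   # each file is now file<i>.txt.gz
```

With `-P 0`, as the man page excerpt above notes, xargs would instead start as many processes as possible at once.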
# 9  
Old 07-11-2010
BTW, ksh93t has built-in support for automatically limiting the number of background jobs that run at the same time. See the MAXJOBS parameter. You can also use the SIGCHLD trap to find out which background job has completed and get its exit status.
# 10  
Old 07-12-2010
Thanks! A small correction: I think it is ksh93t+ and higher, and the variable is called JOBMAX. It works nicely. If you do:
Code:
#!/bin/ksh
JOBMAX=10  # works in ksh93t+
for i in *
do
  sleep 10 &
done
wait

ps will show 11 processes at any given moment, including the parent.
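For shells that lack JOBMAX, the same throttling can be approximated portably. A minimal sketch, with the limit of 4 and the touch-file workers as illustrative assumptions (not from the thread):

```shell
#!/bin/sh
# Portable sketch of JOBMAX-style throttling for shells without it.
# MAX and the worker command are illustrative assumptions.
demo=/tmp/throttle_demo
rm -rf "$demo" && mkdir -p "$demo"
MAX=4
count=0
for i in 1 2 3 4 5 6 7 8; do
  ( sleep 1; touch "$demo/job$i" ) &
  count=$((count + 1))
  # Once MAX jobs are in flight, drain the whole batch.
  # POSIX sh has no 'wait -n', so this is coarser than JOBMAX,
  # which unblocks as soon as any single job finishes.
  if [ "$count" -ge "$MAX" ]; then
    wait
    count=0
  fi
done
wait
ls "$demo" | wc -l   # 8: all jobs completed
```

The batch-wise `wait` is the main difference from ksh93's JOBMAX, which blocks only the n+1th job until one running job completes.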

# 11  
Old 07-12-2010
Quote:
I think it is ksh93t+ and higher and the variable is called JOBMAX
My mistake. It was late. I took it from a Dave Korn email without checking the sources.

https://mailman.research.att.com/pip...q2/002931.html

For completeness, here is the ksh93 release note on JOBMAX.

08-12-04 +SHOPT_BGX enables background job extensions. Noted by "J" in
the version string when enabled. (1) JOBMAX=n limits the number
of concurrent & jobs to n; the n+1 & job will block until a
running background job completes. (2) SIGCHLD traps are queued
so that each completing background job gets its own trap; $! is
set to the job pid and $? is set to the job exit status at the
beginning of the trap. (3) sleep -s added to sleep until the time
expires or until a signal is delivered.
