Waiting for an arbitrary background process (limiting number of jobs running)
Hi,
I'm trying to write a script to decompress a directory full of files. The decompression commands can run in the background, so that many can run at once. But I want to limit the number running at any one time, so that I don't overload the machine.
Something like this:
At the marked spot, I want to wait for one of my background processes to complete. I don't mind which one, but I do want to wait for just one.
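The loop in question presumably looked something like this (a reconstruction, since the original snippet didn't survive the post; the demo directory, file names, and the XXX marker are all illustrative):

```shell
#!/bin/sh
# Illustrative sketch of the decompression loop. Creates a few demo files
# in /tmp (an assumption for the example), compresses them, then fires off
# the decompressions in the background. XXX marks the missing piece: the
# "wait for just one job" logic the question is about.
mkdir -p /tmp/unz_demo && cd /tmp/unz_demo || exit 1
rm -f file*.txt file*.txt.gz
for i in 1 2 3 4; do echo "data $i" > "file$i.txt"; gzip "file$i.txt"; done

MAX=2
for f in file*.txt.gz; do
    # XXX: here we want to block until fewer than $MAX jobs are running
    gzip -d "$f" &
done
wait    # waits for ALL of them; the question is how to wait for just one
```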
wait doesn't work, as it waits for all jobs to complete. On the other hand, wait N doesn't work, because I don't know which job will finish first.
I could use trap "..." 20, but I'd need to be able to pause my script at the XXX line and be able to resume it via the "..." from the trap command. I can't think of a way of doing this ("suspend" in bash might work, but really I need this to work in ksh - I'm not sure the server this will ultimately run on has bash installed).
nope, I don't think you'll do it in ksh.
you'll need waitpid.
I would get the list of files, divide it by the number of processes you want (e.g. 10), and send that many files off at a time via xargs. I don't know how set will react if you have hundreds of files; you might get 'command line too long'.
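If the local xargs supports -P (GNU xargs and modern BSDs do; it is not POSIX, so this is an assumption about the target box), the batching idea needs no manual splitting at all: xargs keeps N workers busy and starts a new one as each finishes, which is exactly the "wait for any one job" behaviour. A sketch, with an illustrative demo directory:

```shell
#!/bin/sh
# xargs -P runs up to 3 decompressions at once and refills each slot as
# soon as a worker exits. -P is a GNU/BSD extension, not POSIX.
mkdir -p /tmp/xargs_demo && cd /tmp/xargs_demo || exit 1
rm -f g*.txt g*.txt.gz
for i in 1 2 3 4 5; do echo "row $i" > "g$i.txt"; gzip "g$i.txt"; done

# ls | xargs is safe here only because the names contain no whitespace.
ls g*.txt.gz | xargs -n 1 -P 3 gzip -d
```

Because xargs builds each command line itself, this also sidesteps the 'command line too long' worry with hundreds of files.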
Thanks, that's an approach I hadn't thought of. One thing it doesn't allow me to do is to report progress - something I'd thought of adding to my original approach was to add a "printf '.'" whenever I started a new decompress. But that's just a nice-to-have - your suggestion gets the job done.
I'd still be interested in any other possibilities that anyone can suggest - this is my first venture into anything more complicated than very basic scripts, and I'm learning a lot I didn't know!
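One other possibility, as an aside: bash 4.3 and later has a builtin that does exactly this. wait -n blocks until any single background job exits, whichever finishes first. It won't help on a ksh-only box, but it shows the shape of the answer, and it leaves room for the printf '.' progress marker. A sketch (bash-only; the /tmp demo directory is illustrative):

```shell
#!/bin/bash
# Requires bash >= 4.3 for `wait -n`. Keeps at most MAX jobs running;
# `jobs -pr` lists the PIDs of currently running background jobs.
mkdir -p /tmp/waitn_demo && cd /tmp/waitn_demo || exit 1
rm -f f*.txt f*.txt.gz
for i in 1 2 3; do echo "payload $i" > "f$i.txt"; gzip "f$i.txt"; done

MAX=2
for f in f*.txt.gz; do
    while [ "$(jobs -pr | wc -l)" -ge "$MAX" ]; do
        wait -n          # returns as soon as any one job finishes
    done
    printf '.'           # progress marker
    gzip -d "$f" &
done
wait                     # collect the stragglers
echo
```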
I think this is too much for a shell and using C or at least some real scripting language may be required here. However I'd love to see a solution for shell if possible.
I tried a perl solution and got really bogged down because I couldn't find an easy way of running a background command (disclaimer: it's a VERY long time since I used perl, but I don't have Python on the box I'm working with :-() Messing round with
seems fraught with potential issues that I don't understand (for a start, it doesn't handle shell metacharacters - should I use exec "sh", "-c", @_ or some similar incantation?)
If someone can confirm a decent Perl equivalent of the shell
I'll see what I can do with the rest of it...
My basic idea for a solution would be to spawn the initial N workers and save their PIDs in a table, then sleep 1 and check which of those PIDs are still alive. For each one that has finished, spawn the next worker and save its PID in place of the old one. Repeat until the job is done.
The problem with counting via pgrep is that it would also count processes that aren't related to the script (any other user can run their own gzip, right?).
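The PID-table idea above can be written in plain sh/ksh. The trick is kill -0, which tests whether a PID still exists without sending any signal, so only this script's own workers are counted. A sketch, assuming an illustrative /tmp demo directory and tiny test files:

```shell
#!/bin/sh
# Poll a table of worker PIDs once a second; refill free slots from the
# remaining file list. kill -0 checks PID existence without signalling,
# so unrelated gzips (unlike with pgrep) are never counted.
mkdir -p /tmp/pidtab_demo && cd /tmp/pidtab_demo || exit 1
rm -f h*.txt h*.txt.gz
for i in 1 2 3 4 5; do echo "line $i" > "h$i.txt"; gzip "h$i.txt"; done

MAX=2
pids=""
set -- h*.txt.gz          # the work queue, held in the positional parameters
while [ $# -gt 0 ] || [ -n "$pids" ]; do
    # Keep only the PIDs that are still alive.
    live=""
    n=0
    for p in $pids; do
        if kill -0 "$p" 2>/dev/null; then
            live="$live $p"
            n=$((n + 1))
        fi
    done
    pids=$live
    # Top up to MAX workers from the queue.
    while [ $# -gt 0 ] && [ "$n" -lt "$MAX" ]; do
        gzip -d "$1" &
        pids="$pids $!"
        n=$((n + 1))
        shift
    done
    sleep 1
done
```

The one-second poll is the cost of portability: unlike wait, it never learns the workers' exit statuses, so error reporting would need a separate mechanism.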