I have a requirement where jobs/scripts need to be run in the background. The concern is that there are around 20 scripts which need to be run in the background. Does running all 20 scripts/jobs at the same time in the background consume much server utilization? If so, what would be an efficient way to run the jobs in parallel with less server consumption? I'm using ksh. Please throw some light.
The bottom line is that it depends on exactly what those 20 scripts are doing. If they are all starting a process which consumes a large amount of memory on the machine, or each is doing a large file transfer, then yes it is possible that you will impact the machine by running all 20 at the same time. On the other hand, these scripts might not significantly contribute to the load on the machine and running them all concurrently will be fine.
If you decide that they must all not be executed concurrently, then you could build a simplistic 'starter' into your driving script that starts the maximum that you deem safe to run concurrently, waits for those to finish, and starts more; this would continue until all scripts have been executed. If you need more sophisticated scheduling (dependencies and/or load average consideration), then you'll need to think about using a scheduler to manage the jobs.
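A minimal sketch of such a simplistic starter, assuming batches of a fixed size are acceptable (the `sleep 0` commands are stand-ins for your real scripts):

```shell
# Run at most MAX background jobs per batch, waiting between batches.
MAX=4
set -- 1 2 3 4 5 6 7 8 9 10   # stand-ins for the 20 script names
count=0
for job in "$@"; do
    sleep 0 &                  # replace with: ./script_${job}.ksh &
    count=$((count + 1))
    if [ "$count" -ge "$MAX" ]; then
        wait                   # wait for the whole batch to finish
        count=0
    fi
done
wait                           # wait for the final partial batch
echo "all jobs finished"
```

The drawback of this batch approach is that one slow job holds back the whole batch; a scheduler (or a per-slot wait) avoids that.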
This may be vague, but without knowing any details about the potential resource consumption of the scripts (CPU, memory, I/O), or the hardware (installed memory, number of cores, etc.), it's difficult to guess.
You can run them with 'nice' to prevent them from lagging the system too much. It won't prevent them from consuming resources, but other things will get priority over them for CPU time.
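For example (here `sleep 0` is a stand-in for a real CPU-heavy script):

```shell
# Start a background job at reduced scheduling priority (niceness +10),
# then wait on its PID and collect its exit status.
nice -n 10 sleep 0 &
pid=$!
wait "$pid"
status=$?
echo "job finished with status $status"
```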
Several warnings for you:
1. Never run a job in the background without storing its PID or waiting for it to end
2. Never run a process in the background and quit
3. If your shell script gets too complicated then it might be a good idea to use a programming language like perl, python, java, etc.
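Warning 1 in practice looks like this, with `sleep 0` standing in for the real job:

```shell
# Store the PID of the background job immediately, then wait on that
# specific PID so you know exactly when it ended and with what status.
sleep 0 &
job_pid=$!            # $! is the PID of the most recent background job
wait "$job_pid"       # block until that specific job exits
job_status=$?
echo "job $job_pid exited with status $job_status"
```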
I have seen some "framework" that executed:
As a result, the user called a command and then had a disowned process running on the system. It might never end. It might do whatever it wants. You cannot tell when it has finished.
As a workaround, people use "ps -ef | grep scriptname", but that fails for long paths, since ps may not show the full path then.
Now for performance: if there are only 20 processes, you shouldn't worry about it unless your server is very limited (say, 4 MiB of RAM and a 50 MHz CPU). Everything depends on how often this happens, what extra operations are performed for every process (e.g. loading a complex environment), and how many resources the processes consume. It might even be that your server is not capable of running a single process like that (e.g. if the process tries to back up the whole Internet).
If you run the jobs in parallel, then usually the purpose is to let them consume more server resources, so that more work gets done in less time. Just test the effects first on a test computer, as you would with anything before running it on a production server. You can tune it by running fewer/more processes in parallel, or by scheduling the processes for a time period when there is little interactive use of the server...
I have multiple jobs, and each job depends on another job.
Each job generates a log. If a job completed successfully, the log file ends with a JOB ENDED SUCCESSFULLY message, and if it failed, it ends with JOB ENDED with FAILURE.
I need help on how to start.
Attaching the JOB dependency... (3 Replies)
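One possible starting point, sketched with a temporary file standing in for a real job log (the file name and messages here are stand-ins):

```shell
# Decide whether to start the dependent job by checking the last line
# of the previous job's log for the success marker.
log=$(mktemp)
echo "JOB ENDED SUCCESSFULLY" > "$log"   # pretend the first job wrote this
if tail -n 1 "$log" | grep -q "JOB ENDED SUCCESSFULLY"; then
    next="start"    # here you would launch the dependent job
else
    next="abort"
fi
rm -f "$log"
echo "decision: $next"
```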
Hello,
I am running GNU bash, version 3.2.39(1)-release (x86_64-pc-linux-gnu). I have a specific question pertaining to waiting on jobs run in sub-shells, based on the max number of parallel processes I want to allow, and then wait... (1 Reply)
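Since bash 3.2 has no `wait -n`, one workable sketch is to keep a queue of PIDs and wait on the oldest one whenever the limit is reached (the `( sleep 0 )` sub-shells are stand-ins for the real jobs):

```shell
# Limit concurrent sub-shells to MAX by waiting on the oldest PID.
MAX=3
pids=""
for i in 1 2 3 4 5 6; do
    ( sleep 0 ) &            # stand-in for the real sub-shell job
    pids="$pids $!"
    set -- $pids             # split the PID list into positional params
    if [ "$#" -ge "$MAX" ]; then
        wait "$1"            # wait for the oldest running job
        shift
        pids="$*"
    fi
done
wait                         # reap whatever is still running
echo "all sub-shells done"
```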
I need to process 50 sqlplus scripts which are listed in a text file. I need to develop a shell script that will read this file and run these sqlplus scripts. At any point in time, the number of sqlplus scripts running shouldn't exceed 6. If any of the sqlplus scripts completes successfully then... (17 Replies)
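A hedged sketch of the throttling part, with `sleep 0` standing in for the sqlplus invocation and a generated temp file standing in for the real list:

```shell
# Read script names from a list file, keeping at most MAX running.
MAX=6
list=$(mktemp)
for i in 1 2 3 4 5 6 7 8 9 10; do echo "script_$i.sql"; done > "$list"
started=0
while read -r script; do
    sleep 0 &                # replace with: sqlplus user/pass @"$script" &
    started=$((started + 1))
    # throttle: if MAX jobs are already running, wait for them to clear
    while [ "$(jobs -p | wc -l | tr -d ' ')" -ge "$MAX" ]; do
        wait
    done
done < "$list"
wait
rm -f "$list"
echo "started $started scripts"
```

Checking for "completes successfully" would mean waiting on individual PIDs and testing their exit statuses rather than a bare `wait`.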
I have a list with four dates
say load_date.lst contains
2010-01-01 2010-01-31
2010-03-01 2010-03-31
2010-05-01 2010-05-31
2010-07-01 2010-07-31
And I have directory /lll/src/sql with a set of sql scripts
1_load.sql
2_load.sql
3_load.sql
I want to run the sql's in series with respect to... (3 Replies)
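A minimal sketch of the nested loop, assuming each sql takes the date range as arguments (the generated list file and the counter stand in for the real load_date.lst and sqlplus calls):

```shell
# For each date range in the list, run the sql scripts in order.
list=$(mktemp)
printf '%s\n' "2010-01-01 2010-01-31" "2010-03-01 2010-03-31" > "$list"
runs=0
while read -r start end; do
    for sql in 1_load.sql 2_load.sql 3_load.sql; do
        # replace with: sqlplus user/pass @"/lll/src/sql/$sql" "$start" "$end"
        runs=$((runs + 1))
    done
done < "$list"
rm -f "$list"
echo "executed $runs script runs"
```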
How can I process jobs in parallel with the conditions below?
Script1.ksh
Script2.ksh
Script3.ksh
Script4.ksh
Script5.ksh
Script6.ksh
Script7.ksh
Script8.ksh
Script9.ksh
Script10.ksh
After successful completion of Script1.ksh I need to run Script7.ksh.
After successful... (4 Replies)
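For the stated dependency, `&&` gives the "run only on success" behavior, and grouping a chain with `{ }` lets it run as one background unit. A sketch with `true`/`false` standing in for Script1.ksh and an echo standing in for Script7.ksh:

```shell
# The second command runs only if the first exits 0.
# In the real script this would be: { ./Script1.ksh && ./Script7.ksh; } &
out=$( { true && echo "Script7 runs"; } )
fail=$( { false && echo "Script7 runs"; } )
echo "success case: $out"
echo "failure case: ${fail:-skipped}"
```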
Hi all,
How do I run a command in parallel 50 times, capturing the result of each run in a separate file?
Eg: myApp arg1 > run1.txt
myApp arg1 > run2.txt
:::::::::::::::::::::::::
:::::::::::::::::::::::::
myApp arg1 > run50.txt
The above way is sequential.
... (3 Replies)
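A sketch of the parallel version, with `echo` standing in for `myApp arg1` and a temp directory for the output files (N is reduced here; use 50 for the real case):

```shell
# Launch N copies in parallel, each redirected to its own numbered file.
dir=$(mktemp -d)
N=5                          # use 50 for the real case
i=1
while [ "$i" -le "$N" ]; do
    echo "result $i" > "$dir/run$i.txt" &   # replace echo with: myApp arg1
    i=$((i + 1))
done
wait                         # block until all copies finish
echo "wrote $N files in $dir"
```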
I need to execute 5 jobs at a time in the background and need to get the exit status of all the jobs. I wrote the small script below, but I'm not sure this is the right way to do it. Any ideas? Please help.
$cat run_job.ksh
#!/usr/bin/ksh
####################################
typeset -u SCHEMA_NAME=$1
... (1 Reply)
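One common pattern, sketched with `sleep 0` standing in for the real jobs: remember every PID, then `wait` on each PID individually so each job's exit status can be checked.

```shell
# Start five jobs, keep their PIDs, then collect per-job exit statuses.
pids=""
for job in 1 2 3 4 5; do
    sleep 0 &                # replace with the real job command
    pids="$pids $!"
done
failures=0
for pid in $pids; do
    wait "$pid" || failures=$((failures + 1))
done
echo "failed jobs: $failures"
```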
Hi,
Could anyone please explain the difference between DataStage server edition jobs and DS parallel extender jobs?
In which scenarios or application areas do we use each of these jobs?
Regards
Suresh (0 Replies)
In a korn shell script, how can I run several processes in parallel at the same time?
For example, I have 3 processes say p1, p2, p3
if I call them as
p1.ksh
p2.ksh
p3.ksh
they will run sequentially, each starting after the previous one finishes. But I want to run them in parallel and want to display "Process p1... (3 Replies)
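The usual ksh answer is to append `&` to each call, save the PIDs, and `wait` on each one. A sketch with `sleep 0` standing in for p1.ksh, p2.ksh, and p3.ksh:

```shell
# Start all three in the background, then wait on each PID and report.
sleep 0 & pid1=$!            # replace with: p1.ksh &
sleep 0 & pid2=$!            # replace with: p2.ksh &
sleep 0 & pid3=$!            # replace with: p3.ksh &
wait "$pid1"; s1=$?; echo "Process p1 finished with status $s1"
wait "$pid2"; s2=$?; echo "Process p2 finished with status $s2"
wait "$pid3"; s3=$?; echo "Process p3 finished with status $s3"
```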