If you're on a multi-user system, it may not be such a good idea to parallelize I/O-heavy processes, because you can completely eat up the machine's available I/O and put extreme load on the server. That could slow the whole server down considerably.
This is exactly what happened when I ran bash-only jobs in the background all at once, and I received a warning from the admin for causing trouble on the server! So I want to restrict the number of jobs to less than the server's core count.
At this moment I do not care too much about efficiency yet, although the cat/cp steps do eat a lot of the I/O capacity.
Yes, the ln -s part is not the big deal. The real challenge is the cat and cp parts, where big files are involved and make the processes slow; that's why I need parallel.
My real code is pretty much the same as the example, and here are the first several rows of the portion with cat:
I was thinking the option would be straightforward, and my impression is that parallel is meant for jobs whose scripts/command options follow a similar pattern. I went to the GNU website and other parallel tutorials but could not spot the part that covers this case.
Also, I found this type of work is quite common for me when processing hundreds of samples, which takes at least a couple of hours when I do it one by one and let the run go overnight. That is not good if I want the results right away, which should be achievable with 16~20 cores under parallel if the scaling is proportional, i.e. 16~20x.
Thanks again; please point out any option for my situation that I may have missed.
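As a minimal sketch of capping the job count at the core count: GNU parallel does this with its -j option, and where parallel is not installed, xargs -P gives the same throttle. The "processing sample N" command below is a stand-in, and the fallback of 4 cores is an assumption for systems without nproc.

```shell
# Cap concurrency at the number of cores (nproc reports it on Linux).
njobs=$(nproc 2>/dev/null || echo 4)   # assumed fallback if nproc is missing

# Run at most $njobs of the stand-in commands at once.
seq 1 8 | xargs -P "$njobs" -I{} echo "processing sample {}"
```

With GNU parallel the equivalent would be `seq 1 8 | parallel -j "$njobs" 'echo processing sample {}'`.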
---------- Post updated at 06:51 PM ---------- Previous update was at 05:47 PM ----------
I did an experiment and found out the simple answer for my example.
Here is my test.
The order of the echoed strings is what I expected!
Using parallel took 1m56.053s, whereas bash-only took 8m45.042s, since that is the sequential sum of the individual processes.
And I'd appreciate any insight/comments if I missed anything!
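The exact test commands were not shown above, so this is only a reconstruction of the kind of experiment described: the same dummy jobs run once sequentially and once in the background with a single wait, which is where the speed-up comes from.

```shell
#!/bin/bash
# Dummy job standing in for the real cat/cp work.
job() { sleep 1; echo "job $1 done"; }

for i in 1 2 3; do job "$i"; done        # sequential: ~3 s total
for i in 1 2 3; do job "$i" & done       # parallel:   ~1 s total
wait                                     # block until all background jobs end
echo "all jobs finished"
```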
I need to execute 5 jobs at a time in the background and get the exit status of all the jobs. I wrote the small script below, but I'm not sure this is the right way to do it. Any ideas? Please help.
$cat run_job.ksh
#!/usr/bin/ksh
####################################
typeset -u SCHEMA_NAME=$1
... (1 Reply)
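One common way to collect per-job exit statuses is to record each background job's PID and then `wait <pid>`, which returns that job's status. This sketch uses bash syntax (ksh arrays differ slightly) and dummy jobs in place of the real ones; the odd-numbered jobs are made to fail just for illustration.

```shell
#!/bin/bash
pids=()
for i in 1 2 3 4 5; do
    ( sleep 0.2; exit $(( i % 2 )) ) &   # dummy job: odd i exits non-zero
    pids+=($!)                           # remember this job's PID
done

fail=0
for p in "${pids[@]}"; do
    if ! wait "$p"; then                 # wait <pid> returns that job's status
        fail=$((fail + 1))
    fi
done
echo "$fail job(s) failed"
```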
I want to log into a remote server, transfer over a new config, back up the existing config, and then replace it with the new one.
I am not sure if I can do this with Bash scripting.
I have set up passwordless login by adding my public key to the authorized_keys file, and it works.
I am a little... (1 Reply)
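The backup-then-replace step itself is just cp plus mv; on the remote side it would run inside a single `ssh host '<commands>'` call after an `scp new.conf host:/tmp/` upload, and with the key-based login described above there is no password prompt. The paths below are made-up placeholders, exercised in a temp directory so the sketch is safe to run.

```shell
#!/bin/sh
dir=$(mktemp -d)
echo "old settings" > "$dir/app.conf"     # stands in for the live config
echo "new settings" > "$dir/new.conf"     # stands in for the uploaded file

cp "$dir/app.conf" "$dir/app.conf.bak"    # back up the existing config
mv "$dir/new.conf" "$dir/app.conf"        # replace it with the new one

cat "$dir/app.conf"        # now holds "new settings"
cat "$dir/app.conf.bak"    # backup preserves "old settings"
```

Remotely, the last four commands would be the body of `ssh host 'cp ... && mv ...'`.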
Hi All,
I am trying to run this script. I have a small problem:
each "./goada.sh" command, when done, produces three files (file1, file2, file3), which are then moved to their respective directories, as can be seen from this script snippet here.
The script goada.sh sends some commands for some... (1 Reply)
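When several such runs go in parallel, giving each run its own working directory keeps file1/file2/file3 from different runs from overwriting each other before the move. In this sketch "goada.sh" is replaced by a stub that just creates the three files.

```shell
#!/bin/bash
base=$(mktemp -d)
run_one() {
    mkdir -p "$base/run_$1"
    cd "$base/run_$1" || return 1
    touch file1 file2 file3        # stand-in for ./goada.sh's output
}

for i in 1 2 3; do ( run_one "$i" ) & done   # subshell per run
wait                                         # all runs finished here
find "$base" -name 'file*' | wc -l           # 9 files: three per run
```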
The status quo is that, within a web application coded completely in PHP (not by me; I don't know PHP), I have to fill out several fields and execute it manually by clicking the "go" button in my browser, several times a day.
That's because:
The script itself pulls data (textfiles) from a... (3 Replies)
I need to find a smarter way to process about 60,000 files in a single directory.
Every night a script runs on each file, generating output in another directory; this used to take 5 hours, but as the data grows it is taking 7 hours.
The files are of different sizes, but there are 16 cores... (10 Replies)
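With 16 cores, one approach is to fan the files out over the cores with xargs -P (GNU parallel -j N is equivalent); since file sizes vary, handing out one file at a time keeps the cores evenly busy. The "processing" below is a stand-in cp, and a small demo tree is built in temp directories so the sketch runs end to end.

```shell
#!/bin/sh
src=$(mktemp -d); dst=$(mktemp -d)
for i in 1 2 3 4 5 6; do echo "data $i" > "$src/f$i"; done

# -print0/-0 keeps unusual filenames safe; -P 16 would match the 16 cores.
find "$src" -type f -print0 |
    xargs -0 -P 4 -I{} sh -c 'cp "$1" "$2/$(basename "$1").out"' _ {} "$dst"

ls "$dst" | wc -l      # one output per input file: 6
```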
Hello,
I am running GNU bash, version 3.2.39(1)-release (x86_64-pc-linux-gnu). I have a specific question about waiting on jobs run in sub-shells, based on the maximum number of parallel processes I want to allow, and then wait... (1 Reply)
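One sketch that works in bash 3.2 (which lacks the newer `wait -n`): poll the running-job count with `jobs -rp` and only launch a new sub-shell once a slot is free. The task body here is a stand-in.

```shell
#!/bin/bash
MAX=3
for i in 1 2 3 4 5 6 7 8; do
    # Busy-wait until fewer than MAX background jobs are running.
    while [ "$(jobs -rp | wc -l)" -ge "$MAX" ]; do
        sleep 0.1                  # a slot opens when some job exits
    done
    ( sleep 0.3; echo "task $i done" ) &
done
wait
echo "all done"
```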
Dear all,
I'm a newbie at programming and I would like to know if it is possible to parallelize this script:
for l in {1..1000}
do
cut -f$l quase2 |tr "\n" "," |sed 's/$/\
/g' |sed '/^$/d' >a_$l.t
done
I tried:
for l in {1..1000}
do
cut -f$l quase2 |tr "\n" "," |sed 's/$/\
/g' |sed... (7 Replies)
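One way to run the loop's iterations concurrently without rewriting its body is to feed the column numbers to xargs -P (GNU parallel -j works the same way, and both use {} as the placeholder). A tiny tab-separated "quase2" is generated below so the pipeline can be shown end to end; `echo >> a_{}.t` stands in for the original embedded-newline sed, appending the final newline after tr has joined the column with commas.

```shell
#!/bin/sh
cd "$(mktemp -d)" || exit 1
printf '1\ta\n2\tb\n3\tc\n' > quase2      # demo input: 3 rows, 2 columns

# Run 2 column extractions at a time (1000 and a larger -P in the real case).
seq 1 2 | xargs -P 2 -I{} sh -c \
    'cut -f{} quase2 | tr "\n" "," > a_{}.t; echo >> a_{}.t'

cat a_1.t a_2.t
```

The GNU parallel form would be `seq 1 1000 | parallel 'cut -f{} quase2 | tr "\n" "," > a_{}.t; echo >> a_{}.t'`.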
Hello,
the bulk of my work is run by scripts. An example is as such:
#!/bin/bash
awk '{print first line}' Input.in > Intermediate.ter
awk '{print second line}' Input.in > Intermediate_2.ter
command Intermediate.ter Intermediate_2.ter > Output.out
It works the way I want it to, but it's not... (1 Reply)
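The two awk extractions read the same input but are independent of each other, so they can run concurrently; only the combining step needs to wait for both. In this sketch, `NR == 1` / `NR == 2` stand in for the real awk programs and cat for the real combining command.

```shell
#!/bin/bash
cd "$(mktemp -d)" || exit 1
printf 'alpha\nbeta\ngamma\n' > Input.in   # demo input

awk 'NR == 1' Input.in > Intermediate.ter   &   # e.g. "print first line"
awk 'NR == 2' Input.in > Intermediate_2.ter &   # e.g. "print second line"
wait                      # both intermediates are complete past this point

cat Intermediate.ter Intermediate_2.ter > Output.out
cat Output.out
```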
I have multiple jobs, and each job depends on another job.
Each job generates a log. If a job completed successfully, its log file ends with a JOB ENDED SUCCESSFULLY message; if it failed, it ends with JOB ENDED with FAILURE.
I need help with how to start.
Attaching the JOB dependency... (3 Replies)
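A sketch of gating a dependent job on its predecessor's log: run job A, then start job B only if A's log ends with the success marker described above. The jobs here are stubs and the log names are made up.

```shell
#!/bin/sh
cd "$(mktemp -d)" || exit 1

run_a() { echo "doing work" > a.log; echo "JOB ENDED SUCCESSFULLY" >> a.log; }
run_b() { echo "JOB ENDED SUCCESSFULLY" > b.log; }

run_a
# Check the last log line before starting the dependent job.
if tail -1 a.log | grep -q "JOB ENDED SUCCESSFULLY"; then
    run_b
    echo "job B started after job A succeeded"
else
    echo "job A failed; skipping job B"
fi
```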
How can I run several bash commands from the command line, without needing or requiring a script file?
Because I'm actually a Windows guy and new here, so for illustration it's sort of:
$ bash "echo ${PATH} & echo have a nice day!"
will do output, for example:... (4 Replies)
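A note on the illustration above: without -c, bash treats the quoted string as a script filename, so -c is what makes it run the string as commands. Separate the commands with ";" to run both, or "&&" to run the second only if the first succeeds (a single "&" would background the first command instead).

```shell
# Runs both echoes in one throwaway shell; no script file needed.
bash -c 'echo "$PATH"; echo "have a nice day!"'
```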