With GNU xargs, you can use the -P option to run jobs in parallel: your program is run with each line of input provided as the arguments to myprogram, and 4 jobs are started at a time.
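A minimal runnable sketch of this, with echo standing in for the poster's hypothetical myprogram:

```shell
# Feed six input lines to GNU xargs; -L 1 passes one whole line per
# invocation, -P 4 keeps up to 4 jobs running at a time.
# 'echo processed' is a stand-in for the real myprogram.
printf '%s\n' alpha beta gamma delta epsilon zeta |
    xargs -L 1 -P 4 echo processed > xargs_out.txt
```

Note that with -P the jobs finish independently, so the order of the output lines is not guaranteed.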
To distribute among several Linux nodes, you can build a cluster, as Neo suggested.
Or you can use ssh/rsh to distribute the jobs to the other hosts. First, make a file containing the hostnames of your cluster. If a host has 2 CPUs, list its hostname twice; if it has 4 cores, list it 4 times.
From here you can go in different directions, but ultimately, you run one rsh/ssh process per line in this hosts file.
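A sketch of that per-host fan-out; node1/node2 and myprogram are hypothetical names, and the remote call is stubbed with echo so the sketch runs anywhere (in real use the body of run_on would be ssh "$1" myprogram):

```shell
#!/bin/sh
# One entry per CPU: node1 appears twice because it has 2 CPUs
# (hypothetical hostnames). run_on stubs the ssh/rsh call with echo.
run_on() {
    echo "job dispatched to $1"
}

hosts="node1
node1
node2"

for h in $hosts; do
    run_on "$h" >> dispatch.log &    # one background worker per entry
done
wait                                 # block until every worker is done
```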
Hi Neo and otheus,
Thanks for your valuable replies....
I am currently exploring all options suggested here....
Hello,
I wish to run parallel processes forked from one script.
Currently what I do is submit them in the background.
For example:
---------------------------------------------------------------
#!/usr/bin/ksh
process1 &
process2 &
process3 &
.....
.....
#here I check for completion of... (4 Replies)
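The completion check the post trails off into is usually just the shell's wait built-in; a minimal sketch, with sleep standing in for process1, process2 and process3:

```shell
#!/bin/sh
# Start three stand-in processes in the background.
sleep 1 &
sleep 1 &
sleep 1 &

# 'wait' with no arguments blocks until every background child exits.
wait
echo "all processes finished" > done.txt
```

wait can also take specific PIDs (saved from $!) and returns that child's exit status, which is how you detect individual failures.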
Hi,
Is there any way to run parallel jobs using make command?
I am using non-GNU make utility on AIX 4.3.
I want to run 2 jobs simultaneously using the make utility.
Thanks.
Suman (0 Replies)
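For what it's worth, GNU make's -j flag does exactly this; as far as I know the classic AIX make has no equivalent, so installing GNU make may be the easiest route. A minimal sketch with two hypothetical independent targets:

```make
# Neither job depends on the other, so 'make -j 2' runs them
# simultaneously; plain 'make' still runs them one after the other.
all: job1 job2

job1:
	./run_job1.sh

job2:
	./run_job2.sh
```

The Makefile stays correct either way; -j only changes how many recipes run at once.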
Hi All,
At present I am using a UNIX script which runs a set of JOBS. These JOBS are to be repeated 20 times, meaning the same set of JOBS is run again with different arguments (from 1 to 20).
Is there any way I can execute them in parallel?
At present its all... (4 Replies)
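A sketch of the 1-to-20 fan-out, assuming the job set can be wrapped in a function; an echo stands in for the real jobs:

```shell
#!/bin/sh
# run_set stands in for the real set of jobs; it receives the
# argument (1..20) that distinguishes each run.
run_set() { echo "job set $1 done" >> runs.log; }

i=1
while [ "$i" -le 20 ]; do
    run_set "$i" &      # same jobs, different argument, all in parallel
    i=$((i + 1))
done
wait                    # block until all 20 sets have finished
```

If 20 simultaneous sets are too heavy for the machine, a tool like xargs -P can cap how many run at once.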
How can I process the jobs below in parallel, given the conditions that follow?
Script1.ksh
Script2.ksh
Script3.ksh
Script4.ksh
Script5.ksh
Script6.ksh
Script7.ksh
Script8.ksh
Script9.ksh
Script10.ksh
After successful completion of Script1.ksh I need to run Script7.ksh.
After successful... (4 Replies)
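One common way to express such conditions is an '&&' chain per dependency, each chain backgrounded so independent chains run in parallel; a sketch with stub functions standing in for the poster's Script1.ksh, Script2.ksh and Script7.ksh:

```shell
#!/bin/sh
# '&&' runs script7 only if script1 exits 0; the two chains are
# independent, so each runs in its own background subshell.
script1() { echo "Script1 ok"; }
script7() { echo "Script7 ok"; }
script2() { echo "Script2 ok"; }

( script1 >> chain1.log && script7 >> chain1.log ) &
( script2 >> chain2.log ) &
wait    # block until both chains have completed
```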
I have 2 scripts; the 1st script calls the 2nd script for each parameter (the parameters are kept in a separate txt file).
1st script
for x in `cat Export_Tables_List.txt`
do
sh Exp_Table.sh $x &
done
echo -e "1) following tables are successfully exported : \n" > temp
cat... (1 Reply)
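One likely refinement here is a wait between launching the exports and writing the report, so the report cannot run while exports are still in flight; a runnable sketch, with a stub standing in for Exp_Table.sh and a two-line stand-in parameter file:

```shell
#!/bin/sh
# Export_Tables_List.txt and Exp_Table.sh are the poster's names; the
# two-line table list and the exp_table stub are stand-ins so this runs.
printf 'EMP\nDEPT\n' > Export_Tables_List.txt
exp_table() { echo "exported $1" >> exports.log; }

while read -r tbl; do
    exp_table "$tbl" &           # one background export per table
done < Export_Tables_List.txt

wait   # without this, the report below could run before exports finish
echo "1) following tables are successfully exported :" > temp
cat exports.log >> temp
```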
Since there've been a few requests for a method to execute commands on multiple CPUs (logical or physical), with various levels of shell-, make-, or Perl-based solutions, ranging from well-done to well-meant, and mostly specific to a certain problem, I've started to write a C-based solution... (4 Replies)
Hi All,
We have a table that has to store around 80-100 million records. The table is partitioned by a column called Market Code. There are 30 partitions each corresponding to one MRKT_CD.
The source of this table is a join between 3-4 other tables. We are loading this table through SQLPLUS... (2 Replies)
I have a few very huge files, ~2 billion rows of 130 columns (CDR data), in a folder. I have written a shell script that needs to read each file in the folder and create new files based on some logic.
But the problem is that it's taking a long time to create each new file due to the size, so I don't want to corrupt... (6 Replies)
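One way to speed this up without touching the original file is to cut each input into pieces with split(1) and run one worker per piece; a sketch, with seq and wc standing in for the real CDR file and per-piece logic:

```shell
#!/bin/sh
# big.dat stands in for one huge CDR file; wc -l stands in for the
# real per-piece transformation logic.
seq 1 100 > big.dat
split -l 25 big.dat piece.          # 4 pieces of 25 lines each

for p in piece.??; do
    ( wc -l < "$p" | tr -d ' \t' > "$p.cnt" ) &   # one worker per piece
done
wait
cat piece.??.cnt > counts.txt       # merge partial results in order
```

The original input is only ever read, so a crash mid-run cannot corrupt it; you just delete the pieces and rerun.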
I have a requirement (in a shell script) to connect to several servers at once and execute a series of commands.
I am aware that ssh can be used for sequential execution. But since most of the commands that I need to execute take a long time, I have to go for the parallel option.
Is there... (2 Replies)
I have, say, x number of procedures to run; i.e. I have one procedure which accepts a variable, and I need it to run in parallel and capture the error code from Unix in case it fails.
sqlplus <<EOF
exec test_t (abc,124);
EOF
sqlplus <<EOF
exec test_t (abc,125);
EOF
sqlplus <<EOF... (2 Replies)
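Capturing each job's error code works by saving $! for every background job and calling wait on each PID; a sketch with a stub standing in for the sqlplus calls (argument 125 is made to fail so the error path is exercised):

```shell
#!/bin/sh
# run_proc stands in for: sqlplus <<EOF ... exec test_t(abc,$1); EOF
# It fails (exit 1) for argument 125 to simulate a failing procedure.
run_proc() { [ "$1" -ne 125 ]; }

pids=""
for arg in 124 125 126; do
    run_proc "$arg" &
    pids="$pids $!"          # remember each background job's PID
done

rc=0
for p in $pids; do
    wait "$p" || rc=1        # wait returns that job's exit status
done
echo "overall status: $rc" > proc_status.txt
```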
Discussion started by: ATWC
2 Replies
LEARN ABOUT DEBIAN
cd-hit-2d-para
CD-HIT-2D-PARA.PL(1) User Commands CD-HIT-2D-PARA.PL(1)
NAME
cd-hit-2d-para.pl - divide a big clustering job into pieces to run cd-hit-2d or cd-hit-est-2d jobs
SYNOPSIS
cd-hit-2d-para.pl options
DESCRIPTION
This script divides a big clustering job into pieces and submits the jobs to remote computers over a network to run them in parallel. After
all the jobs have finished, the script merges the clustering results as if you had just run a single cd-hit-2d or cd-hit-est-2d.
You can also use it to divide big jobs on a single computer if your computer does not have enough RAM (with the -L option).
Requirements:
1 When running this script over a network, the directory where you run the script and the input files must be available on all the remote hosts under an identical path.
2 If you choose "ssh" to submit jobs, you must have passwordless ssh to every remote host; see the ssh manual for how to set it up.
3 I suggest using a queuing system instead of ssh; PBS and SGE are currently supported.
4 cd-hit-2d, cd-hit-est-2d, cd-hit-div and cd-hit-div.pl must be in the same directory as this script.
Options
-i input filename for 1st db in fasta format, required
-i2 input filename for 2nd db in fasta format, required
-o output filename, required
--P program, "cd-hit-2d" or "cd-hit-est-2d", default "cd-hit-2d"
--B filename of a list of hosts, required unless the -Q or -L option is supplied
--L number of cpus on the local computer, default 0. When you are not running over a cluster, you can use this option to divide a big
clustering job into small pieces; I suggest you just use "--L 1" unless you have enough RAM for each cpu
--S Number of segments to split 1st db into, default 2
--S2 Number of segments to split 2nd db into, default 8
--Q number of jobs to submit to the queuing system, default 0. By default, the program uses ssh mode to submit remote jobs
--T type of queuing system, "PBS", "SGE" are supported, default PBS
--R restart file, used to resume after a crashed run
-h print this help
More cd-hit-2d/cd-hit-est-2d options can be specified on the command line
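Putting the options together, a typical networked run might look like this (db1.fa, db2.fa, hosts.txt and the output name are hypothetical, built only from the options listed above):

```
cd-hit-2d-para.pl -i db1.fa -i2 db2.fa -o db1_vs_db2 --P cd-hit-2d --B hosts.txt --S 2 --S2 8
```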
Questions, bugs, contact Weizhong Li at liwz@sdsc.edu
cd-hit-2d-para.pl 4.6-2012-04-25 April 2012 CD-HIT-2D-PARA.PL(1)