Special Forums > UNIX and Linux Applications > High Performance Computing
Parallel Execution on Multiple Systems
Post 302285241 by 123an, Sunday 8 February 2009, 01:44 AM
More information regarding Parallel Execution

Hi quirkasaurus,

Thanks for your reply!

As per your reply, I would like to give some more details about my problem.

I have 500 different sets of arguments in a file (say list.txt). I need to pass each set to an executable (an application, say "./applictn.out"), which runs and does its job for that argument set. That means executing the whole batch produces 500 sets of output.

If I execute this from a script serially (one set after the other), execution takes a long time. So within a single system I can execute them in parallel (i.e. as concurrent background jobs) using xargs, like:

Syntax: xargs [options] <utility>
Given:  xargs -P 4 -L 1 ./applictn.out < list.txt

(Here -P 4 runs up to four jobs at a time and -L 1 gives each invocation one line of list.txt as its arguments. Note that xargs reads arguments from standard input, not from a file named on its command line.)

This will be executed in parallel on a single system.
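For reference, a minimal single-machine sketch that also keeps the 500 outputs apart. It assumes GNU xargs (-P is a GNU extension), one argument set per line of list.txt, and the out.N file names are just illustrative:

# Tag each line with its line number, run up to 4 jobs in parallel,
# and write each job's output to its own file (out.1 ... out.500).
nl -ba list.txt | xargs -P 4 -L 1 sh -c '
    n=$1; shift                      # $1 is the line number added by nl
    ./applictn.out "$@" > "out.$n" 2>&1
' _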

Now my problem is that I want to execute these sets (the 500 argument sets) on different Linux systems, running in parallel, so that I get the output in a shorter amount of time. By parallelism I mean:

machine 1 takes, say, 150 sets and starts processing;
machine 2 takes, say, 200 sets and starts processing;
machine 3 takes the remaining sets and starts working on them.

All machines should work in parallel (one possible way to do this is sketched after the quoted reply below).

Thanks..
123an





Quote:
Originally Posted by quirkasaurus
not enough info for me...

are you talking about the same script with 500 different sets of arguments?
do you have the arguments already somewhere or
will you generate them?
should the jobs run in series once on their individual machine?
or can they run concurrently with a maximum threshold?

is this just for benchmarking? or is this going to be a permanent run and
everything should take about the same time?

i'm thinking . . . . just create all the command lines....
dump them all into a file...

then have another script, read this file,
divide them equally into scripts for each machine....
rcp these scripts to the respective machines,
and kick 'em off.
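A minimal sketch of that approach, using ssh/scp in place of rcp. Assumptions: password-less ssh to every host, applictn.out already present in the login directory on each machine, and machine1..machine3 as placeholder host names:

#!/bin/sh
# Split the 500 argument lines into three roughly equal chunks:
# chunk.aa, chunk.ab, chunk.ac  ("split -n l/3" needs GNU coreutils;
# with an older split, use e.g. "split -l 167 list.txt chunk." instead).
split -n l/3 list.txt chunk.

set -- machine1 machine2 machine3      # placeholder host names
for chunk in chunk.a?; do
    host=$1; shift
    scp "$chunk" "$host:/tmp/args.txt"
    # Each host works through its own chunk, 4 jobs at a time.
    ssh "$host" 'xargs -P 4 -L 1 ./applictn.out < /tmp/args.txt' &
done
wait    # returns once every machine has finished its share

To get the uneven 150/200/remaining division described above, replace the single split call with explicit "split -l" counts per chunk; the dispatch loop stays the same.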
 

xjobs(1)							   User Commands							  xjobs(1)

NAME
xjobs - construct command line and execute jobs in parallel

SYNOPSIS

xjobs [options] [utility [argument ...]]

DESCRIPTION
xjobs reads job descriptions line by line and executes them in parallel. It limits the number of jobs executing in parallel and starts new ones as others finish. To do so, it combines the arguments from every input line with the utility and arguments given on the command line. If no utility is given as an argument to xjobs, then the first argument on every job line is used as the utility. To locate the utility, xjobs searches the directories given in the PATH environment variable and uses the first file found.

xjobs is most useful on multiprocessor machines when one needs to execute several time-consuming commands that could possibly be run in parallel. With xjobs this can be achieved easily, and it is possible to limit the load of the machine to a useful value. It works similarly to xargs, but starts several processes simultaneously and gives only one line of arguments to each utility call.

By using I/O redirectors, the standard input, output, and error streams of executed jobs can be redirected: use < to redirect standard input, > to redirect standard output, >! to redirect standard output and overwrite an existing file, >> to append standard output to an existing file, >& to redirect both standard output and standard error to the same file, and >>& to append both to the same file. If passed on the command line, these operators specify the default I/O redirection, which can be overridden by specifying another redirector on a specific job's argument line. Each of these operators expects a filename after it. See EXAMPLES below. If you need more advanced shell features than the redirection operators supported by xjobs, use a shell of your preference as the utility.

Every job line can be preceded by a "cd directory;" command that tells xjobs in which directory the job shall be executed. This can be used only once per line; for more complex scripting, pass the line to execute to a shell of your choice.

xjobs constructs the arguments of the jobs to execute from each input line. Each input line creates a separate job, whereas newline characters are handled as regular whitespace by xargs. To include whitespace characters in arguments, either precede them with a backslash or quote them with single or double quote characters. A backslash preceding a newline makes xjobs ignore the newline character, which lets you spread the arguments for a single job across multiple lines. To include quotation marks in quoted arguments, precede them with a backslash. Lines passed to xjobs beginning with a # character are interpreted as comments.

Finally, xjobs also includes a mechanism for serializing execution, which makes it possible to parallelize independent jobs while sequencing jobs that have a dependency. This is achieved by inserting a line that consists only of two percent characters (%%). All jobs before this sequence point are executed at the requested number of parallel jobs; on hitting the sequence point, xjobs waits for all processes to finish and then continues starting the jobs that follow it (see the illustration after this section).

When passing a named pipe (i.e. a file name created by mkfifo) via option -s as input, xjobs will close and reopen the fifo when reaching end-of-file. This makes it possible to set up an xjobs server and send jobs to it from multiple programs. See EXAMPLES below for an example.
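As an illustration of the %% sequence point described above, a hypothetical job script (the file names are made up) could look like this:

# archive two trees in parallel ...
tar cf /tmp/home.tar /home/me
tar cf /tmp/etc.tar /etc
%%
# ... then compress them only after both archives are complete
gzip /tmp/home.tar
gzip /tmp/etc.tar

Fed to xjobs with "xjobs -s jobscript", the two tar jobs run concurrently, and the gzip jobs start only after both have finished.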
OPTIONS

-j <jobs>
    Sets the maximum number of jobs that are started in parallel. By default the number of executing jobs is limited to the number of online processors in the system. If the number passed as <jobs> is followed by an 'x' character (e.g. 2.5x), the value is multiplied by the number of online processors before setting the job limit; i.e. on a machine with 4 online processors, passing 2.5x to option -j yields a job limit of 10.

-s <script>
    Use the file <script> instead of standard input to read the job descriptions.

-n
    Redirect standard output and standard error of executed jobs to /dev/null.

-l <num>
    Combine the arguments of <num> input lines into a single job.

-p
    Start jobs interactively, prompting the user.

-q <num>
    Limits the number of queued jobs to <num> elements. Normally xjobs reads jobs from standard input or the given script and queues them if they cannot be started at once. With this option, xjobs stops reading as soon as <num> jobs are queued and resumes reading once a new job has been started. This makes xjobs allocate less memory; use it when passing a huge number of jobs to limit memory consumption. It can also increase performance, but make sure jobs are fed to xjobs fast enough.

-1
    Pass one argument per job, expected to be terminated by a newline character. No argument parsing is performed, which makes it easier to process jobs whose arguments include whitespace or other tokens that would influence argument parsing.

-0
    Same as -1, but a null character ('\0') is expected as the job and argument terminator instead of a newline. That way arguments containing newline characters can be processed without escape sequences.

-V
    Print the version number of xjobs and exit.

-v <level>
    Set the verbosity of xjobs to <level>. Valid levels are: 0=silent, 1=error, 2=warning, 3=info, 4=debug. The default is 3.
EXAMPLES

If you have a lot of .zip files that you want to extract, use xjobs like this:

    $ ls -1 *.zip | xjobs unzip

If you want to do the same without getting the output of each unzip task on your terminal, try:

    $ ls -1 *.zip | xjobs -n unzip

To gzip all *.bak files in a given directory hierarchy:

    $ find . -name '*.bak' | xjobs gzip

To generate index files for a set of *.jar files, you can use the redirection feature of xjobs:

    $ ls -1 *.jar | sed 's/\(.*\)/\1 > \1.idx/' | xjobs jar tf

If you also want to capture the error output, use >& instead of >.

You can also use xjobs to execute several different commands. To do so, write a script file that contains every job you want to execute and pass it to xjobs with the option -s:

    $ cat - > script
    unzip my.zip
    tar xf my.tar
    lame --silent my.wav my.mp3
    crypt notsecret < mydata > secretfile
    ^D
    $ xjobs -s script

To queue up jobs from multiple sources, use a named pipe and pass it explicitly as the input script, then write the jobs to the named pipe:

    $ mkfifo /var/run/my_named_pipe
    $ xjobs -s /var/run/my_named_pipe &
    $ echo unzip 1.zip >> /var/run/my_named_pipe
    $ echo tar cf /backup/myhome.tar /home/me >> /var/run/my_named_pipe
ENVIRONMENT VARIABLES

PATH    Determines the location of the command to execute.

AUTHORS

Thomas Maier-Komor <thomas@maier-komor.de>
Donations via PayPal are welcome!

HOMEPAGE

http://www.maier-komor.de/xjobs.html

LICENSE

GNU General Public License Version 2

SEE ALSO

xargs(1)

Thomas Maier-Komor			20100915			xjobs(1)
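Applied back to the question above: on a single machine, xjobs could take the place of the xargs call (a sketch, assuming one argument set per line of list.txt):

$ xjobs -j 4 ./applictn.out < list.txt

Per the DESCRIPTION above, xjobs combines each input line with the given utility, here running at most 4 copies of ./applictn.out at a time.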