Shell Programming and Scripting: Help required with xargs to use sqlplus
Post 302947705 by RudiC on Saturday 20th of June 2015 06:06:45 PM
I don't think xargs will run those programs in parallel without sending any of them into the background.
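Where GNU xargs is available, its -P option (see discussion 3 below) can start several invocations at once; otherwise the calls have to be backgrounded explicitly. A minimal sketch along those lines; the connect string, script names and job count are placeholders, not details from this thread:

    # GNU xargs: up to 4 sqlplus sessions at a time, one .sql script per call
    ls -1 *.sql | xargs -P 4 -I{} sqlplus -s user/pass@db @{}

    # portable alternative: background each call from a loop and wait for all of them
    for f in *.sql; do
        sqlplus -s user/pass@db @"$f" &
    done
    wait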
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Help with xargs

Hi there, I am trying to move around 3000 files from one directory to another. The mv command is complaining about too many arguments. I tried to use the xargs command but with no luck. Could somebody provide help? Regards (4 Replies)
Discussion started by: JimJim
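A hedged sketch for this kind of problem; the target directory is a placeholder and plain file names without embedded newlines are assumed:

    # xargs batches the names so no single mv call exceeds the argument-length limit
    ls -1 | xargs -I{} mv {} /path/to/target/

    # with GNU mv, fewer and larger batches per call are possible via -t
    printf '%s\n' * | xargs mv -t /path/to/target/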

2. Shell Programming and Scripting

why we use xargs..

Hi, can anyone help me understand why we use xargs? Is it acting like a placeholder? Thanks, Krips. (3 Replies)
Discussion started by: kripssmart
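Roughly speaking, xargs turns its standard input into command-line arguments rather than acting as a placeholder; a small illustration:

    # without xargs, grep searches the list of file names itself (arriving on stdin)
    find . -name '*.log' | grep -l ERROR

    # with xargs, the names become arguments, so grep searches inside each file
    find . -name '*.log' | xargs grep -l ERROR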

3. UNIX for Advanced & Expert Users

xargs -P

I discovered that GNU's xargs has a -P option to allow its processes to run in parallel. Great! Is this a GNU thing, or is it supported by other platforms as well? (4 Replies)
Discussion started by: otheus

4. Shell Programming and Scripting

Getting required fields from a test file in the required format in UNIX

My data is something like shown below. date1 date2 aaa bbbb ccccc date3 date4 dddd eeeeeee ffffffffff ggggg hh I want the output like this date1date2 aaa eeeeee I searched the forum but didn't find an exact matching solution. Please help. (7 Replies)
Discussion started by: rdhanek

5. Shell Programming and Scripting

Using xargs

Hi, I just want to know how we use the xargs command to find files that are larger than a specified size in a given directory. (6 Replies)
Discussion started by: sumit the cool
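One common pattern for this, assuming GNU find and xargs; the directory and the 100 MB threshold are placeholders:

    # list files larger than 100 MB, null-separated so odd file names survive
    find /some/dir -type f -size +100M -print0 | xargs -0 ls -lh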

6. Shell Programming and Scripting

Help in using xargs

Hi, I have a requirement to RCP files from a remote server to the local server. Also, the RCP has to run in parallel. However, using 'xargs' retrieves 2 file names during each loop. How do we restrict xargs to only one file name per invocation and loop over the remaining files? I use the below code for... (2 Replies)
Discussion started by: senthil3d
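Assuming GNU xargs, -I limits each rcp call to a single file name and -P runs several copies at once; the host, paths and file list below are placeholders:

    # one file name per rcp invocation, at most 4 transfers running in parallel
    xargs -P 4 -I{} rcp "remotehost:/src/{}" /local/dest/ < filelist.txt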

7. Shell Programming and Scripting

Help with xargs

Using the bash shell I'm trying to either create a command for the command line or a script that will show netstat info for a given process name. Here is an example of what I'm trying to do:
$ ps aux | grep catalina | grep -v grep | awk '{print $2}'
5132
$ netstat -nlp | grep 5132
(Not all processes... (11 Replies)
Discussion started by: axiopisty
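One way to chain those two steps, assuming pgrep is available (catalina is just the example process name from the post):

    # look up every PID for the name, then match the PID/program column of netstat
    for pid in $(pgrep catalina); do
        netstat -nlp 2>/dev/null | grep "$pid/"
    done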

8. Windows & DOS: Issues & Discussions

Help required for Running SQLPLUS command from Bat file

Hello All, Good Afternoon. I am new to this platform and I need some help with running a SQL file from a .bat file. Below is what I am doing: 1. I placed the below command in one .bat file: start putty.exe -ssh user@host -pw pwd -m C:\2.txt 2. In 2.txt, I have the below command. ... (3 Replies)
Discussion started by: PavanPatil

9. Shell Programming and Scripting

Sqlplus error - sqlplus -s <login/password@dbname> : No such file or directory

I am using the bash shell. Whenever I declare an array and then use sqlplus, I get a sqlplus error and return code 127.
IFS=","
declare -a Arr=($Variable1);
SQLPLUS=sqlplus -s "${DBUSER}"/"${DBPASS}"@"${DBASE}
echo "set head off ; " > ${SQLCMD}
echo "set PAGESIZE 0 ;" >> ${SQLCMD}... (6 Replies)
Discussion started by: arghadeep adity
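For what it's worth, exit status 127 normally means "command not found", and in the snippet above the unquoted assignment makes the shell try to execute -s as a command. A hedged sketch of the quoting that was probably intended:

    # quote the whole command string so the assignment is not parsed as "VAR=value command"
    SQLPLUS="sqlplus -s ${DBUSER}/${DBPASS}@${DBASE}"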

10. UNIX and Linux Applications

Problem with SQLplus command: "bash: sqlplus: command not found"

Hi all, I am facing an error on my server (it is a running server) when I use the sqlplus command:
$ sqlplus
bash: sqlplus: command not found
The database is up and running; I just need to access sqlplus to import the dump file as a daily backup. I already checked the directory... (4 Replies)
Discussion started by: clerck
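"command not found" usually just means the Oracle client's bin directory is not on PATH for that login; a minimal sketch, with a purely illustrative ORACLE_HOME path:

    export ORACLE_HOME=/u01/app/oracle/product/12.1.0/dbhome_1
    export PATH=$ORACLE_HOME/bin:$PATH
    sqlplus /nolog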
xjobs(1)							   User Commands							  xjobs(1)

NAME
       xjobs - construct command line and execute jobs in parallel

SYNOPSIS
       xjobs [options] [utility [argument ...]]

DESCRIPTION
       xjobs reads job descriptions line by line and executes them in parallel. It limits the number of jobs executing in parallel and starts new jobs as jobs finish. To do so, it combines the arguments from every input line with the utility and arguments given on the command line. If no utility is given as an argument to xjobs, then the first argument on every job line is used as the utility. To execute the utility, xjobs searches the directories given in the PATH environment variable and uses the first file found in these directories.

       xjobs is most useful on multiprocessor machines when one needs to execute several time-consuming commands that could possibly be run in parallel. With xjobs this can be achieved easily, and it is possible to limit the load of the machine to a useful value. It works similarly to xargs, but starts several processes simultaneously and gives only one line of arguments to each utility call.

       By using I/O redirectors the standard input, output, and error streams of executed jobs can be redirected. Use < to redirect standard input, > to redirect standard output, >! to redirect standard output and overwrite an existing file, >> to append standard output to an existing file, >& to redirect both standard output and standard error output to the same file, and >>& to append both standard output and standard error output to the same file. If passed on the command line, these operators specify the default I/O redirection, which can be overridden by specifying another redirector for a specific job on its argument line. After each of these operators a filename is expected. See EXAMPLES below. If you need more advanced shell features than the redirection operators supported by xjobs, then use a shell of your preference as the utility.

       Every job line can be preceded by a "cd directory;" command that tells xjobs in which directory the job shall be executed. This can be used only once per line. For more complex scripting, please pass the line to execute to a shell of your choice.

       xjobs constructs the arguments of the jobs to execute from each input line. Each input line creates a separate job, whereas xargs treats newline characters as regular whitespace. To include whitespace characters in arguments, either precede them with a backslash or quote them with single or double quote characters. A backslash character preceding a newline makes xjobs ignore the newline character, making it possible to pass the arguments for a single job across multiple lines. To include quotation marks in quoted arguments, precede them with a backslash. Lines passed to xjobs beginning with a # character are interpreted as comments.

       Finally, xjobs also includes a mechanism for serializing execution, which makes it possible to parallelize independent jobs while sequencing jobs that have a dependency. This is achieved by inserting a line that consists only of two percent characters (%%). All jobs before this sequence point are executed at the requested number of jobs in parallel. When hitting the sequence point, xjobs waits for all processes to finish and then continues starting the jobs that follow the sequence point.

       When passing a named pipe (i.e. a file name created by mkfifo) via option -s as input, xjobs will close and reopen the fifo when reaching end-of-file. This makes it possible to set up an xjobs server and send jobs to it from multiple programs. See section EXAMPLES below for an example.
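       A sketch of a job script exercising the features just described (directories and file names are illustrative): each line may carry its own cd prefix and redirection, and the %% line forces everything above it to finish before the job below starts. The script would be fed to xjobs on standard input or via -s.

              cd /data/in; gzip batch1.dat
              cd /data/in; gzip batch2.dat >& gzip2.log
              %%
              tar cf /backup/data-in.tar /data/in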
OPTIONS
       -j <jobs>
              Sets the maximum number of jobs that are started in parallel. By default the number of executing jobs is limited to the number of online processors in the system. If the number passed as <jobs> is followed by an 'x' character (e.g. 2.5x), the value is multiplied by the number of online processors before setting the job limit, so on a machine with 4 online processors, passing 2.5x to option -j yields a job limit of 10 jobs.

       -s <script>
              Use file script instead of the standard input to read the job descriptions.

       -n     Redirect standard output and standard error output of executed jobs to /dev/null.

       -l <num>
              Combine the arguments of <num> input lines for a single job.

       -p     Start jobs interactively, prompting the user.

       -q <num>
              Limits the number of queued jobs to num elements. Normally xjobs reads in jobs from standard input or the given script and queues them if they cannot be started at once. With this option, xjobs stops reading as soon as num jobs are queued and resumes reading once a new job has been started. This way xjobs allocates less memory. Use this option to limit memory consumption if you pass a huge number of jobs to xjobs. It can also increase the performance of xjobs, but make sure that jobs are fed to xjobs fast enough.

       -1     Pass one argument per job, terminated by a newline character. No argument parsing is performed. This makes it easier to process jobs whose arguments may include whitespace characters or other tokens that would influence argument parsing.

       -0     Same as -1, but a null character is expected as the job and argument terminator instead of a newline character. This way arguments containing newline characters can also be processed without escape sequences.

       -V     Print the version number of xjobs and exit.

       -v <level>
              Set the verbosity of xjobs to level. Valid levels are: 0=silent, 1=error, 2=warning, 3=info, 4=debug. The default verbosity level is 3.
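       As an illustration of -0 and the 'x' form of -j described above (the path is a placeholder):

              $ find /var/log -name '*.log' -print0 | xjobs -0 -j 2x gzip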
EXAMPLES
       If you have a lot of .zip files that you want to extract, then use xjobs like this:

              $ ls -1 *.zip | xjobs unzip

       If you want to do the same without getting the output of each unzip task on your terminal, then try this:

              $ ls -1 *.zip | xjobs -n unzip

       To gzip all *.bak files in a given directory hierarchy, use it the following way:

              $ find . -name '*.bak' | xjobs gzip

       To generate index files for a set of *.jar files, you can use the redirection feature of xjobs and do the following:

              $ ls -1 *.jar | sed 's/\(.*\)/\1 > \1.idx/' | xjobs jar tf

       If you also want to capture the error output, then use >& instead of >.

       You can also use xjobs to execute several different commands. To do so, write a script file that contains every job you want to execute and pass it to xjobs with the option -s:

              $ cat - > script
              unzip my.zip
              tar xf my.tar
              lame --silent my.wav my.mp3
              crypt notsecret < mydata > secretfile
              ^D
              $ xjobs -s script

       To queue up jobs from multiple sources with xjobs, use a named pipe and pass it explicitly as the input script, then write the jobs to the named pipe:

              $ mkfifo /var/run/my_named_pipe
              $ xjobs -s /var/run/my_named_pipe &
              $ echo unzip 1.zip >> /var/run/my_named_pipe
              $ echo tar cf /backup/myhome.tar /home/me >> /var/run/my_named_pipe
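       In the context of this thread, the same mechanism could drive sqlplus; a minimal sketch in which the connect string and script names are placeholders:

              $ cat - > sqljobs
              sqlplus -s user/pass@db @report1.sql > report1.log
              sqlplus -s user/pass@db @report2.sql > report2.log
              sqlplus -s user/pass@db @report3.sql > report3.log
              ^D
              $ xjobs -j 2 -s sqljobs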
ENVIRONMENT VARIABLES
       PATH   Determines the location of command.

AUTHORS
       Thomas Maier-Komor <thomas@maier-komor.de>

       Donations via PayPal are welcome!

HOMEPAGE
       http://www.maier-komor.de/xjobs.html

LICENSE
       GNU General Public License Version 2

SEE ALSO
       xargs(1)

Thomas Maier-Komor                              20100915                              xjobs(1)