My bet is that it's an environment issue. What shell are you in when you run it from the command line? Are you also specifying the shell at the top of your script? For example, if you were in the Korn shell, you might have this as the first line of your script:
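A minimal sketch of what that might look like, assuming ksh lives at its usual /bin/ksh location, together with the exit-status check mentioned below:

```shell
#!/bin/ksh
# The first line names the interpreter explicitly, so the script does
# not depend on whichever shell you happen to be typing in.
jobs                 # the command under test
status=$?            # capture its return value right away
echo "jobs returned $status"
```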
Also I'd make sure your `jobs` command is in your $PATH when the script executes. After your script executes you might also check $? to see what the return value is. Hope that helps.
There is a section in the Bash man page which goes some way toward explaining your predicament (the same also holds for other shells that support job control, though man ksh doesn't word it quite like that).
That's to say that `jobs` run inside your script won't show you background processes started in your interactive shell, because the script runs in a different environment (its own shell process) from your shell.
Your best bet, if it's practicable, is to run your script in the same environment as the shell which started the jobs, i.e. by sourcing it rather than executing it, or else to pass the jobs' PIDs into the script as arguments.
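A sketch of the PID-passing approach ("myscript" and check_pids are placeholder names, not anything from the original post):

```shell
#!/bin/sh
# From the interactive shell that owns the jobs, you would invoke:
#   ./myscript $(jobs -p)
check_pids() {
    for pid in "$@"; do
        if kill -0 "$pid" 2>/dev/null; then   # signal 0 = existence test
            echo "$pid is still running"
        else
            echo "$pid has finished"
        fi
    done
}
check_pids "$@"
```

If sourcing is an option instead, `. ./myscript` runs the script in the current shell, where `jobs` still sees the background jobs directly.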
I have multiple jobs, and each job depends on another job.
Each job generates a log. If a job completes successfully, its log file ends with a JOB ENDED SUCCESSFULLY message; if it fails, it ends with JOB ENDED with FAILURE.
I need help on how to start.
Attaching the JOB dependency... (3 Replies)
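A hypothetical checker for that convention (the log file name is made up; the marker strings are the ones described above):

```shell
#!/bin/sh
# Decide a job's outcome from the last line of its log.
LOGFILE="job1.log"
printf 'starting...\nJOB ENDED SUCCESSFULLY\n' > "$LOGFILE"   # sample log
if tail -n 1 "$LOGFILE" | grep -q 'JOB ENDED SUCCESSFULLY'; then
    result=success
else
    result=failure
fi
echo "job1: $result"
rm -f "$LOGFILE"
```

A dependent job would only be launched when the checker reports success for its predecessor.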
Here is my test script:
#!/bin/sh
result=`jobs`
echo "
Jobs:
"$result
result=`ls`
echo "
LS
"$result
Here is the output:
Jobs:
LS
0 1 2 3 4 5 6 7 gcd initialize.sh #inter_round_clean.sh# inter_round_clean.sh inter_round_clean.sh~ look parallel_first_run.sh... (3 Replies)
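That output matches the explanation above: the script's shell has no jobs of its own, so `jobs` prints nothing. Jobs started inside the script itself do show up; a small sketch (the temporary file path is arbitrary):

```shell
#!/bin/sh
# A script's job table is empty unless the script launches background
# work itself; jobs started in the interactive shell are invisible here.
sleep 2 &
sleep 2 &
jobs -p > /tmp/jobs.$$        # PIDs of *this* shell's background jobs
count=$(wc -l < /tmp/jobs.$$)
rm -f /tmp/jobs.$$
echo "background jobs started by this script: $count"
wait
```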
Hello,
I am running GNU bash, version 3.2.39(1)-release (x86_64-pc-linux-gnu). I have a specific question pertaining to waiting on jobs run in sub-shells, based on the max number of parallel processes I want to allow, and then wait... (1 Reply)
Good morning!
When I type in the command "jobs", it just takes me back to the command prompt.
Any idea why and how I can display all the jobs that are currently running off that host?
Ben (6 Replies)
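`jobs` returning straight to the prompt just means that shell has no background jobs of its own; it never lists other shells' or other users' processes. For a host-wide view, `ps` is the usual tool. A quick sketch:

```shell
#!/bin/sh
sleep 2 &                    # give this shell one background job
jobs                         # now has something to report
ps -ef | grep '[s]leep'      # the same process, visible host-wide
wait                         # clean up the background job
```

The `[s]leep` bracket trick keeps the grep process itself out of the match.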
Hello all, I have a quick question. I work in a computational science laboratory, and we recently got a few Mac Pros to do molecular optimizations on. However, our normal supercomputers have queue systems, mainly PBS.
Anyway, the macs obviously don't have PBS, but I've read about... (0 Replies)
Hi Friends,
I am using the dsjob command in a UNIX script to invoke DataStage jobs.
DataStage server jobs (version 7.5.2)
The command looks like this:
$DSBinPath/dsjob -server :$SERVER_PORTID -run -mode NORMAL -jobstatus -param INPUT_GCDB_DIR=$InputFilePath -param... (0 Replies)
I need to execute 5 jobs at a time in the background and get the exit status of all the jobs. I wrote the small script below, but I'm not sure this is the right way to do it. Any ideas? Please help.
$cat run_job.ksh
#!/usr/bin/ksh
####################################
typeset -u SCHEMA_NAME=$1
... (1 Reply)
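One way to collect every job's status (the `sleep` commands stand in for the real jobs): record each background PID, then `wait` on each PID individually, since `wait <pid>` returns that particular job's exit status:

```shell
#!/usr/bin/ksh
pids=""
for i in 1 2 3 4 5; do
    sleep 1 &                # placeholder for the real job
    pids="$pids $!"          # remember its PID
done
fail=0
for pid in $pids; do
    wait "$pid"              # returns this job's exit status
    rc=$?
    echo "job $pid exited with status $rc"
    if [ "$rc" -ne 0 ]; then
        fail=1
    fi
done
echo "overall failure flag: $fail"
```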
Hey guys, I have a quick question. I have an upcoming interview (tomorrow), and during the discussion via phone I was asked if I was familiar with "monitoring jobs in Linux/UNIX using the command line." Now, I currently work in the MS world and I am underneath the NOC here at my company, so I have had no reason to do... (2 Replies)