Serializing script Failing for more commands


 
# 1  
Old 03-28-2010
Serializing script Failing for more commands

I have a requirement to serialize various rsh scripts that hit my server from an external scheduler. No matter how many scripts come in via rsh, only one should execute at a time and the others should wait.

I have made the scheduler call my shell script with the command to be run as a parameter. This shell script is responsible for queuing the commands and executing them one after the other.

The following is the code that I have written. It runs just fine for a few jobs, but as the number of queued jobs increases, the script fails. I am running ksh on Ubuntu 8.04.

Can anyone please tell me if there is anything that is obviously wrong? I know my request amounts to a review of my code, but I would be grateful if anyone can share any similar code that I could readily use.

I have read another similar post where various alternatives are suggested, but those will not work for me.

Code:
#!/usr/bin/ksh

pid_line=$$_$1
queue_file=/tmp/queue_file

# sleep for a pseudo-random 0-4 seconds before joining the queue
sleep $(echo $$%5|bc)

# make your entry into the queue at the end
echo $pid_line >> $queue_file

# while you are not the top-most job, loop
while [[ $(grep -n $pid_line $queue_file | cut -d: -f1) -ne 1 ]]
do
    # if by any rare chance someone removed you from the queue, join back
    if [[ $(grep $pid_line $queue_file | wc -l) -eq 0 ]]
    then
        echo $pid_line >> $queue_file
    fi

    # if the job at the top of the queue was killed or terminated before it
    # could remove itself, remove it
    curr_pid_line=$(head -1 $queue_file)
    if [ $(ps -ef | grep $(echo $curr_pid_line|cut -d_ -f1) | grep -v grep | wc -l) -eq 0 ]
    then
        sleep 1
        grep -Ev $curr_pid_line $queue_file | cat > $queue_file
    else
        # some other job is running, wait for some time before trying again
        sleep 20
    fi
done

# you are the first job now, run yourself
$1
result=$?

# remove yourself from the queue
grep -vE $pid_line $queue_file | cat > $queue_file

# exit with the exit code of the command that you ran
exit $result


# 2  
Old 03-28-2010
Your script isn't protected against simultaneous execution of its critical sections. You need atomic lock operations to make sure one instance isn't going to modify your queue_file while it is being processed by another. Scripting is probably not the best approach for that kind of thing. Moreover, the "grep something file | cat > file" construct seems bogus to me: "file" will be erased before being read. That is at least what it does with ksh on Solaris.
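
If you do stick with a shell script, the classic atomic primitive available to scripts is mkdir: it either creates the directory or fails, and it does so atomically, so it can serve as a lock around your queue-file updates. A rough, untested sketch (the lock directory name is arbitrary):

Code:
#!/usr/bin/ksh

lockdir=/tmp/queue_file.lock

# spin until we manage to create the lock directory; only one
# instance at a time can succeed
until mkdir "$lockdir" 2>/dev/null
do
    sleep 1
done

# ---- critical section: safe to modify the queue file here ----
# e.g. echo "$pid_line" >> "$queue_file"

rmdir "$lockdir"    # release the lock

If a script dies inside the critical section the lock directory stays behind, so you would still need some cleanup logic (a trap, for instance).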
# 3  
Old 03-29-2010
That command works with ksh - I am hoping it will work on Solaris also.

Code:
$ cp /etc/passwd testfile
$ grep -E abcd testfile | cat > testfile
$ cat testfile
abcd:x:100:100:ABCD,,,:/home/abcd:/bin/bash

I think the critical sections are those that are removing and adding lines to the queue.
That is,
Code:
echo $pid_line >> $queue_file

and
Code:
grep -vE $pid_line $queue_file | cat > $queue_file

I agree that, in spite of my logic to re-add accidentally removed jobs back to the queue, there could still be some problems.

But I think this is what I have to go ahead with, given that there will never be more than 20 jobs at any given time - anything more than this would be overkill for the functional requirement that I have.

I am thinking of writing a C program with named pipes when I have more time and money.
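
Roughly, the idea would be a single long-running reader that executes whatever gets written into a FIFO, one command at a time. Just to show the shape, an untested shell sketch (paths and names made up):

Code:
#!/usr/bin/ksh

fifo=/tmp/job_fifo
[ -p "$fifo" ] || mkfifo "$fifo"

# single dispatcher loop: submitters write one command line into the
# FIFO and this loop runs them strictly one after the other
while :
do
    if read -r cmd < "$fifo"
    then
        eval "$cmd"
    fi
done

A submitter would then just do echo "some_command arg" > /tmp/job_fifo. The obvious drawback is that the caller does not get the job's exit status back, which my current script does return.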

I am surprised that there is no easy way of doing this on Unix.

Edit by bakunin: I provided the code tags you surely have just forgotten - no problem, but please bring them with you next time. Thank you.

# 4  
Old 03-29-2010
Quote:
Originally Posted by nkamatam
That command works with ksh - I am hoping it will work on Solaris also.

$ cp /etc/passwd testfile
$ grep -E abcd testfile | cat > testfile
$ cat testfile
abcd:x:100:100:ABCD,,,:/home/abcd:/bin/bash
This command works "by accident". Its behavior is undefined. testfile might be cleared before being read depending on unpredictable factors.
Quote:
I am surprised that there is no easy way of doing this on Unix.
What makes you believe there is not? Unix bundles a job scheduler, and "at -q xx", with xx being a custom queue in which simultaneous jobs are forbidden, looks suited to this task, although I haven't really tested that approach.
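
Untested, but the usage would be roughly this (queue letter and command made up):

Code:
# pick some otherwise unused queue letter, e.g. "z"
echo "/path/to/mycommand arg1 arg2" | at -q z now

at -l -q z    # list the jobs still waiting in that queue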
# 5  
Old 03-29-2010
To be honest, I think that the whole design is flawed - not only the piping of the file back into itself, which jiliagre has already pointed out (and rightfully so, I might add).

A better way to do this would be to use the filesystem's ability to sort by date: instead of maintaining a queue file, maintain a directory of (timestamped) marker files to manage the jobs. Here is a sketch of a solution I think might work:

Code:
#! /bin/ksh

typeset workdir=/path/to/dir
typeset myself=$$
typeset action="$1"    # the job gets passed from outside

touch ${workdir}/job.${myself}     # enqueue the job

while [ "$(ls -rt ${workdir} | head -1)" != "job.${myself}" ] ; do
     sleep 5     # wait until our job is the oldest in the directory,
                    # which means we are the first in queue
done

# now that we are the first one do the action:

$action   # do whatever the job is supposed to do

rm ${workdir}/job.${myself}  # remove ourselves from the queue

exit 0
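
Assuming the scheduler passes the whole command as one quoted argument, as in your original script, the call would look something like this (the wrapper name is just an example):

Code:
./serialize_job.ksh "/path/to/some_command arg1 arg2"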

I hope this helps.

bakunin