Automate the process of running jobs on several directories consecutively


 
# 1  
Old 09-17-2011

Hi, I have about 10 directories in which I run the same code. But this code is only run in a directory when the computation in the previous directory was successful (explained below).

My directories are named as :

VF_50, VF_100, VF_150, VF_200.............

As you can see, the number after the underscore keeps increasing. The lowest (in this case VF_50) is the first directory in which I want to run the code, so I need a shell script that can determine the order in which to visit the directories.

So I first run the code in VF_50. If it's successful (determined by opening a file output.dat created by the code and searching for the word "success"), I move on to directory VF_100.
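
For the success check itself I imagine something along these lines (just a rough sketch; I'm assuming a plain grep on output.dat would be enough):

Code:
if grep -q "success" output.dat; then
        echo "run succeeded, move on to the next directory"
else
        echo "run failed, stop here"
fi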

Is there any way to automate this?

Many thanks.
# 2  
Old 09-17-2011
Yes, you can achieve such logic programmatically. Just wait for someone to post a solution here.
# 3  
Old 09-17-2011
I've made some progress on my own so far. The only problem is that I assume each directory number is incremented by 50 (VF_50, VF_100, VF_150, etc.), but this may not always be the case. I'd like to make it more general. This is how it looks now.

Code:
i=50
for dir in VF_*
do
        if [ -d "VF_$i" ]; then
                cd "VF_$i"
                pwd
                cp ../VF_${i_old}/restrt.bin .      # restart file from the previous run (i_old is not set anywhere yet)
                ./oppre_cT_T < inp01_$i > out.dat &
                wait $!
        else
                echo "Directory VF_$i does not exist."
        fi

        let i=$i+50
        cd ../
done

# 4  
Old 09-17-2011
This should get you started:

Code:
#!/usr/bin/env ksh

set -e          # force exit if subshell fails
ls -d vf_* | while read dir
do
    (   # run in subshell to preserve original working directory
        if ! cd "$dir"
        then
            echo "abort: cannot switch to dir: $dir"
            exit 1
        fi

        echo "running command in: $dir"     # verbose for testing if needed
        #your-command-goes-here

        if  grep -q "success" output.dat
        then
            echo "successfully executed command in: $dir"
        else
            echo "abort: command failed in dir: $dir"
            exit 1
        fi
    )
done

exit

The script assumes that you need to switch into each directory before running your command. It aborts if the cd command fails, or if the word "success" does not appear in output.dat after your command has run. It also assumes that all of the directories are named vf_* and live in the same parent directory.

I ran a quick test using Kshell; it probably works with bash, but I didn't test it under bash.

Hope this helps

---------- Post updated at 11:22 ---------- Previous update was at 11:17 ----------

I see we overlapped posts! You are making good progress.

Quote:
Originally Posted by lost.identity
The only problem is that I assume each directory number is incremented by 50 (VF_50, VF_100, VF_150, etc.), but this may not always be the case. I'd like to make it more general.
You don't need to set i and increment it... use the variable you set in the loop:

Code:
for dir in vf_*
do
   cd $dir
   pwd
   cd ../
done

---------- Post updated at 11:25 ---------- Previous update was at 11:22 ----------

One more suggestion. You don't need to run your command asynchronously (you're not doing anything else while it runs), so drop the trailing ampersand (&) and the wait and it will make your script easier to read/maintain.
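
For example, using the command line from your own script (the program and file names are simply copied from your post):

Code:
# asynchronous version: start the job in the background, then wait for it
./oppre_cT_T < inp01_$i > out.dat &
wait $!

# synchronous version: the shell waits for the command to finish on its own
./oppre_cT_T < inp01_$i > out.dat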

Last edited by agama; 09-17-2011 at 12:23 PM.. Reason: typo
# 5  
Old 09-17-2011
Many thanks for the reply.

Sorry, I wasn't very clear when I first explained it. Actually, I need to see if the command ran successfully in a particular directory before moving on to the next one (I need to copy data from the previous successful run into the current directory).

I don't know how I could do this without having a counter, as the order in which I run the commands depends on the directory number (i.e. the number after the "_" in VF_*).

I need to figure out how this would work if the increment is not constant.
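
The closest I've come up with is pulling the numbers out and sorting them numerically, something like this (just an idea; I haven't tried to wire it into the copying step yet):

Code:
# list the numbers after the "_" in increasing numeric order, whatever the increment
for n in $(ls -d VF_* | sed 's/VF_//' | sort -n)
do
        echo "would process VF_$n next"
done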

This is how it looks now (the word "success" appears several times in my file, but it only appears within the last nine lines if the computation finally succeeded; I didn't know how to write this any more neatly).

Code:
#!/bin/bash

i=50
c=0
for dir in VF_*
do
        let c=$c+1
        if [ -d "VF_$i" ]; then
                cd "VF_$i"
                pwd
                if [ $c -ne 1 ]; then
                        cp ../VF_${i_old}/restrt.bin .    # copying the file from the previous successful run
                        ./oppre_cT_T < inp01_$i > out.dat &
                else
                        ./oppre_cT_T < inp01_$i > out.dat &
                fi
                wait $!
        else
                echo "Directory VF_$i does not exist."
        fi

        let i_old=$i
        let i=$i+50

        cmd=$(cat ../../phi_0_6_T/VF_100/out.dat | tail -9 | grep -ci "SUCCESS")

        if [ $cmd -ne 0 ]; then
                echo "Success"
                cd ../
        else
                echo "Unsuccessful"
                exit 1
        fi

done

exit

# 6  
Old 09-17-2011
Sorry. I had assumed that your directory names would sort in numerical order, but I now see that they won't. This is the way I'd modify your script to eliminate incrementing i. I also question the fact that you are searching the same file for 'success' rather than the file that was just created.
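
To illustrate, here is a quick demonstration using just the four example names from your first post (the output is shown in the comments):

Code:
ls -d VF_*
# glob (lexical) order:  VF_100  VF_150  VF_200  VF_50   -- 100 sorts before 50

ls -d VF_* | sed 's/.*_//' | sort -n
# numeric order of the suffixes:  50  100  150  200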

Code:
#!/usr/bin/env bash

i_old=""                    # last directory number to copy restart file from
ls -d VF_* | sed 's/.*_//' | sort -n | while read i     # visit the directories in numeric order of the suffix
do
        if [[ -d "VF_$i" ]]; then
            if ! cd "VF_$i"
            then
                echo "abort: unable to switch to VF_$i"     # rare, but better to test for failure than to assume
                exit 1                                      # that it will always work
            fi

            pwd
            if [[ -n $i_old ]]                      # not the first directory
            then
                cp ../VF_${i_old}/restrt.bin .      # copy the restart file from the previous successful run
            fi

            ./oppre_cT_T < inp01_$i > out.dat
            i_old=$i

            #cmd=$(cat ../../phi_0_6_T/VF_100/out.dat | tail -9 | grep -ci "SUCCESS")
            # um, the previous seems wrong as you are always checking the same out.dat file. maybe this:
            n=$( tail -9 out.dat | grep -ci "success" )

            if (( $n > 0 )); then
                echo "Success ($i)"
            else
                echo "Unsuccessful ($i)"
                exit 1
            fi

            cd ../
        else
                echo "Directory VF_$i does not exist."
        fi
done

exit

Also, you were capturing the old directory number even when it wasn't a directory. I moved this up so that you capture i into i_old, and check for success, only when the directory actually exists.
# 7  
Old 09-18-2011
That's great! Thanks so much.

It was my mistake when I pasted the script earlier; it should check the file in the current directory for success.