This should get you started:

Code:
#!/usr/bin/env ksh

set -e          # abort the script if any command (including the per-directory subshell) fails
ls -d vf_* | while read dir
do
    (   # run in a subshell to preserve the original working directory
        if ! cd "$dir"
        then
            echo "abort: cannot switch to dir: $dir" >&2
            exit 1
        fi

        echo "running command in: $dir"     # verbose for testing if needed
        #your-command-goes-here

        if grep -q "success" output.dat
        then
            echo "successfully executed command in: $dir"
        else
            echo "abort: command failed in dir: $dir" >&2
            exit 1
        fi
    )
done

exit

The script assumes that you need to switch into each directory before running your command. It aborts if the cd fails, or if, after your command has run, output.dat does not contain the word "success". It also assumes that all of the target directories are named vf_* and live in the directory you start the script from.

I ran a quick test using ksh; it should work under bash as well, but I haven't tested that.
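
If you want to dry-run the logic before wiring in your real command, a throw-away test like this should do it (the vf_* names, the dummy output.dat contents, and the run_jobs.ksh filename are just placeholders for whatever you actually use):

Code:
# create two directories that will "pass" and one that will "fail"
mkdir -p vf_100 vf_200 vf_300
echo "success" > vf_100/output.dat
echo "success" > vf_200/output.dat
echo "failed"  > vf_300/output.dat

./run_jobs.ksh      # the script above, saved under any name you like;
                    # it should report success for vf_100 and vf_200,
                    # then abort when it reaches vf_300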

Hope this helps

---------- Post updated at 11:22 ---------- Previous update was at 11:17 ----------

I see we overlapped posts! You are making good progress.

Quote:
Originally Posted by lost.identity
The only problem is that I assume that each file is incremented by 50 (VF_50, VF_100, VF_150 etc.) but this may not always be the case. I'd like to make it more general.
You don't need to set i and increment it... use the variable you set in the loop:

Code:
for dir in vf_*
do
   cd "$dir"
   pwd
   cd ..
done
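
One thing to watch: the vf_* glob expands in plain alphabetical order, so vf_100 comes before vf_50. If the processing order matters and the part after the underscore is always a number, you could feed the loop a numerically sorted list instead -- just a sketch, assuming names of the form vf_<number>:

Code:
ls -d vf_* | sort -t_ -k2 -n | while read dir
do
    ( cd "$dir" && pwd )    # subshell again, so there's no need to cd back out
done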

---------- Post updated at 11:25 ---------- Previous update was at 11:22 ----------

One more suggestion: you don't need to run your command asynchronously (you're not doing anything else while it runs), so drop the trailing ampersand (&) and the wait; that will make your script easier to read and maintain.
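
In other words, instead of something like this (the command name and redirections here are only placeholders for whatever you're actually running):

Code:
your-command < input.dat > output.dat &     # start the job in the background ...
wait                                        # ... then sit and wait for it to finish

just call it directly; the shell won't move on to the next step (the grep on output.dat) until the command has finished:

Code:
your-command < input.dat > output.dat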
