
Monitoring processes in parallel and process log file after process exits

# 1  
Old 09-12-2017
Monitoring processes in parallel and process log file after process exits

I am writing a script that kicks off a log-gathering process on multiple nodes in parallel using "&". Each process creates its own log file, which I would like to filter and convert to CSV format after the process completes. I am facing the following issue:
1. Monitor all processes in parallel: whichever process completes first, I would like to convert that process's output log file to CSV format right away. Sometimes a process can take more than 30 minutes to complete. The problem with my code is that it walks through the files serially, rather than handling whichever process completes (and releases its file) first. Please help.

I have the following code:
#!/bin/sh -x

exec 2>&1
for IPADD in `cat $infile | awk '{print $1}' | tr -d '\015'`
do
    nohup ${PWD}/ $IPADD > ${OPFILE} &
    DPID=$!
    echo -e "$DPID $OPFILE" >> pid_pfile.txt
done

##### Loop to parse and rename the files after data collection is complete.
PALIVE=`ps cax | grep $DPID | grep -o '^[ ]*[0-9]*'`
if [ -z "$PALIVE" ]; then
    HNAME=`grep -i hostname | awk '{print $NF}'`
    awk '/name/,/exit/' $OPFILE | head -n -1 | awk '{print $1,$2,$3,$4,$5}' > ${HNAME}.txt
fi

PCOUNT=`pgrep | wc -l`

while [ $PCOUNT -gt 0 ]; do

    for DPID in `pgrep | awk '{print $1}'`
    do
        PALIVE=`ps -p $DPID --no-headers | wc -l`
        if [ $PALIVE == 0 ]; then
            wait $DPID
            OPFILE=`grep $DPID pid_pfile.txt | awk '{print $2}'`
            # sed "/$DPID/d" pid_pfile.txt > pid_pfile.txt
        fi
        sleep 120
    done

    PCOUNT=`pgrep "junk" | wc -l`
done

rm -f pid_pfile.txt


$infile contains the list of node IP addresses.
# 2  
Old 09-12-2017
Stupid question, but why not something like
nohup ${PWD}/ $IPADD | awk '/name/,/exit/{print $1,$2,$3,$4,$5}' | head -n -1 > ${OPFILE} &

The output files would then be post-processed on the fly.

I'm not up on job control, but by using bash or ksh I suspect you could change the DPID line to capture each background job's PID (e.g. from $!),

and at the end of the loop wait for all processes:
wait $DPID
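The suggestion above can be made into a runnable sketch. Here gather_logs is a stand-in shell function for the real collector command (an assumption; the original script name is not shown in the thread), and the node names are made up:

```shell
#!/bin/sh
# Sketch: filter each node's output on the fly in a backgrounded pipeline,
# then wait for every pipeline before moving on.
# gather_logs is a stand-in for the real per-node collector command.
gather_logs() {
    printf 'name start\n1 2 3 4 5 6\nexit end\n'
}

PIDS=""
for IPADD in nodeA nodeB        # stand-in node names
do
    ( gather_logs "$IPADD" | awk '/name/,/exit/' | head -n -1 \
        | awk '{print $1,$2,$3,$4,$5}' > "${IPADD}.out" ) &
    PIDS="$PIDS $!"
done

for P in $PIDS                  # wait on each pipeline's PID
do
    wait "$P"
done
```

Because the filtering runs inside the backgrounded subshell, each output file is already in its final filtered form the moment its pipeline exits.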

# 3  
Old 09-12-2017
I think the problem might be the wait and the commands that follow it. They all run serially, i.e. you wait and then do something for each PID in sequence.

You might need to do something more like this:-

while read PID OPFILE
do
    (wait $PID ; process_OPFILE $OPFILE ) &
done < pid_pfile.txt              # Read each line of the file in a loop

wait    # Make sure all have completed before continuing to any end of script process.

This would create a watcher for each main process whose report you want to process afterwards. You can see that I read the two variables from the file together at the start of the loop, and you can then use them as you wish. This is just an example. Consider writing a function for process_OPFILE to keep the code neater if you have a lot to do.
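A runnable sketch of the watcher-per-process idea, with one caveat: `wait` can only wait on children of the shell that issues it, so a backgrounded watcher subshell cannot `wait` on a sibling PID and instead has to poll (here with `kill -0`). The collectors and process_OPFILE below are trivial stand-ins:

```shell
#!/bin/sh
# Watcher pattern sketch: one background subshell per main process, each
# polling until its PID exits, then post-processing that PID's file.
process_OPFILE() {
    tr ' ' ',' < "$1" > "${1%.txt}.csv"     # stand-in "convert to CSV"
}

watch_and_convert() {
    # Poll until the PID is gone; wait(1) cannot wait on a sibling process.
    while kill -0 "$1" 2>/dev/null; do sleep 1; done
    process_OPFILE "$2"
}

: > pid_pfile.txt
# Two stand-in "collector" processes writing their log files.
( sleep 1; echo "a b c" > node1.txt ) & echo "$! node1.txt" >> pid_pfile.txt
( sleep 2; echo "d e f" > node2.txt ) & echo "$! node2.txt" >> pid_pfile.txt

while read PID OPFILE
do
    watch_and_convert "$PID" "$OPFILE" &    # one watcher per collector
done < pid_pfile.txt

wait    # all collectors and watchers have finished here
```

Each conversion starts as soon as its own collector exits, regardless of the order the PIDs were listed in.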

Additionally, I've seen you try a sed on a file while writing the output to the same file. Whilst probably no longer required, this will fail because the redirection > opens and empties the file before sed has had a chance to read it. If you want to update a file in place like this, try:-
sed -i "/$DPID/d" pid_pfile.txt
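To make the difference concrete, a tiny demonstration (demo.txt is a made-up file):

```shell
# "sed '/x/d' file > file" loses data: the > truncates the file before
# sed reads it. Safe forms: write to a temp file, or use GNU sed's -i.
printf 'keep\ndrop\n' > demo.txt
sed '/drop/d' demo.txt > demo.tmp && mv demo.tmp demo.txt   # portable form
# sed -i '/drop/d' demo.txt                                 # GNU sed form
```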

Does this help you?

# 4  
Old 09-12-2017
Hi Andrew,

Your method works only if the output needs format filtering alone. I also need to rename the output file based on HNAME:
HNAME=`grep -i hostname ${OPFILE} | awk '{print $NF}'`

I'm not sure we can incorporate that into the same line:
nohup ${PWD}/ $IPADD | awk '/name/,/exit/{print $1,$2,$3,$4,$5}' | head -n -1 > ${OPFILE} &
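One way the rename could still happen on the fly is to put the whole sequence, including the hostname lookup, inside the backgrounded subshell rather than on one pipeline line. A sketch, with a stand-in collect function and made-up log contents:

```shell
#!/bin/sh
# Sketch: filter AND rename per node inside one backgrounded subshell.
# collect is a stand-in for the real nohup'ed gathering command.
collect() {
    printf 'hostname myhost\nname start\n1 2 3 4 5\nexit end\n'
}

for IPADD in nodeA                  # stand-in node name
do
    (
        collect "$IPADD" > "${IPADD}.raw"
        HNAME=`grep -i hostname "${IPADD}.raw" | awk '{print $NF}'`
        awk '/name/,/exit/' "${IPADD}.raw" | head -n -1 \
            | awk '{print $1,$2,$3,$4,$5}' > "${HNAME}.txt"
        rm -f "${IPADD}.raw"
    ) &
done
wait
```

The temporary .raw file is needed because the hostname lookup and the section filter both have to read the full output.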

# 5  
Old 09-12-2017
You can make your script a lot simpler by using more of the shell's own basic features. You can split input into fields in the shell without the help of awk, sed, and tr. You can redirect a loop's output into a file once instead of reopening the same file 37 times to append.

Also, why bother with all the pgrep stuff when you made yourself such a nicely formatted list of PID's to read?


infile="$1" ; shift
exec 2>&1

NEWIFS=`printf "\r\n\t "` # Make sure read splits on carriage returns too

while IFS="$NEWIFS" read IPADD JUNK
do
        nohup ${PWD}/ $IPADD > ${OPFILE} &
        DPID=$!
        echo "$DPID $OPFILE"
done < "$infile" > pid_pfile.txt

while read DPID OPFILE
do
        wait "$DPID"  # Waiting for a specific PID may require bash or ksh
        HNAME=`grep -i hostname | awk '{print $NF}'` # ???? What is this reading from?
        awk '/name/,/exit/' $OPFILE | head -n -1 | awk '{print $1,$2,$3,$4,$5}' > ${HNAME}.txt
done < pid_pfile.txt

rm -f pid_pfile.txt

Also, if we knew what your data looked like, we could probably streamline that awk / grep / awk / head / awk down into one awk call.
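For illustration only, here is what such a collapse could look like, assuming a guessed log layout (a "hostname <host>" line, then a section running from a /name/ line to an /exit/ line whose first five fields are wanted). The real data may well differ:

```shell
# Hypothetical one-awk replacement for grep + awk + head + awk.
# sample.log is made-up data matching the assumed layout.
printf 'hostname myhost\nname start\n1 2 3 4 5 6\nexit end\n' > sample.log

awk 'tolower($0) ~ /hostname/ { host = $NF }   # remember the hostname
     /name/ { insec = 1 }                      # open the section
     /exit/ { insec = 0 }                      # close it BEFORE printing,
     insec  { print $1, $2, $3, $4, $5 > (host ".txt") }   # so /exit/ is dropped
    ' sample.log
```

Closing the section before the print rule reproduces the original `head -n -1` behaviour of dropping the /exit/ line, and the hostname variable doubles as the output file name.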

Last edited by Corona688; 09-12-2017 at 05:49 PM.. Reason: Fix IPADDR / IPADD typo
# 6  
Old 09-12-2017
That is wonderful, Corona!! It works for me. Thanks for your help.

