Monitoring processes in parallel and processing each log file after its process exits
I am writing a script that kicks off log-gathering processes on multiple nodes in parallel using "&". Each process creates its own log file, which I would like to filter and convert to CSV format after the process is complete. I am facing the following issue:
1. Monitor all processes in parallel: whichever process completes first, I would like to convert that process's output log file to CSV format first. Sometimes a process might take more than 30 minutes to complete. The problem with my code is that it converts the files serially, not in the order the processes complete and release their files. Please help.
I have the following code:
Stupid question, but why not something like
The output files would then be post-processed on the fly.
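The idea can be sketched like this: run the whole pipeline, collector and converter together, in the background, so each CSV is produced as the data arrives. `gather_logs` and `to_csv` are stand-ins for the poster's real commands, not anything from the thread.

```shell
#!/bin/bash
# Sketch: pipe each node's collector straight into a converter so the
# CSV appears on the fly.  gather_logs and to_csv are stand-ins here.
gather_logs() { printf 'host %s status OK\n' "$1"; }   # stand-in collector
to_csv()      { tr ' ' ','; }                          # stand-in converter

for HNAME in node1 node2 node3
do
    gather_logs "$HNAME" | to_csv > "${HNAME}.csv" &   # whole pipeline backgrounded
done
wait    # block until every pipeline has finished
```

Because the converter sits in the same pipeline as the collector, there is no separate "wait, then convert" step at all.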
I'm not up on job control, but using bash or ksh I suspect you could change the DPID line to
and at the end of the loop wait for all processes:
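A minimal sketch of that suggestion, with `$!` capturing each background PID as it is launched and a single `wait` on the whole set afterwards. The PIDS variable and the sleep/touch stand-in are mine, not the poster's code:

```shell
#!/bin/bash
# Sketch: collect each background PID via $!, then wait for them all.
PIDS=""
for HNAME in node1 node2 node3
do
    ( sleep 1; touch "done.$HNAME" ) &   # stands in for the real collector
    DPID=$!                              # $! is the PID of the last background job
    PIDS="$PIDS $DPID"
done
wait $PIDS                               # returns once every listed process exits
```

Note this still processes nothing until *all* jobs are finished, which is exactly the serial-ordering complaint the next reply addresses.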
Andrew
I think the problem might be the wait and the commands that follow it. They are all serial, i.e. you wait and then do something for each PID in sequence.
You might need to do something more like this:-
This would create a watcher for each main process whose report you want to process afterwards. You can see that I read the two variables from the file at the same time at the start of the loop, and you can then use them as you wish. This is just an example; writing a function such as process_OPFILE keeps the code neater if you have a lot to do.
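A sketch of the watcher idea: one small background watcher per main process, each waiting for its own PID and converting that output file the moment the process exits. The pid_list layout (PID then file name, one pair per line) and the process_OPFILE body are illustrative assumptions, not the poster's actual code:

```shell
#!/bin/bash
# Sketch: one watcher per main process; each converts its file on exit.
process_OPFILE() {                        # example converter: spaces -> commas
    tr ' ' ',' < "$1" > "${1%.log}.csv"
}

# --- demo setup so the sketch runs standalone ---
echo "node1 42 OK" > node1.log
sleep 2 &                                 # pretend this is the collector
echo "$! node1.log" > pid_list            # "DPID OPFILE", one pair per line
# ------------------------------------------------

while read DPID OPFILE                    # read both fields in one go
do
    (
        while kill -0 "$DPID" 2>/dev/null # poll until the process is gone
        do
            sleep 1
        done
        process_OPFILE "$OPFILE"          # convert as soon as it exits
    ) &                                   # each watcher runs independently
done < pid_list
wait                                      # only the watchers are left to reap
```

`kill -0` is used rather than `wait` inside the watcher because a subshell cannot `wait` on a process that is not its own child; polling works for any PID you can signal.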
Additionally, I've seen you try a sed on a file while writing the output to the same file. Whilst probably no longer required, this will fail because the redirector > will open and empty the file before you have had a chance to read it. If you want to update a file like this, try:-
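The safe pattern is to write to a temporary file and move it back over the original, since `sed 's/.../' file > file` truncates the input before sed reads it:

```shell
#!/bin/bash
# Sketch: edit-in-place done safely via a temporary file.
echo "hello world" > sample.log
sed 's/world/there/' sample.log > sample.log.tmp && mv sample.log.tmp sample.log
cat sample.log
```

(GNU sed also offers `-i` for in-place editing, which does the temp-file dance for you, but the explicit form above is portable.)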
Your method works only if there is just file-format filtering. I also need to rename the output file based on HNAME.
Not sure if we can incorporate that in the same line.
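One way to fold the rename in is to carry HNAME as a third field in the list file, so the watcher can convert and rename in one step. The three-field layout is my assumption, not something established in the thread:

```shell
#!/bin/bash
# Sketch: watcher converts AND renames using an HNAME field per line.
echo "10 20 30" > raw1.log
sleep 2 &
echo "$! raw1.log web01" > pid_list        # "DPID OPFILE HNAME" per line

while read DPID OPFILE HNAME
do
    (
        while kill -0 "$DPID" 2>/dev/null; do sleep 1; done
        tr ' ' ',' < "$OPFILE" > "${HNAME}.csv"   # convert and rename together
    ) &
done < pid_list
wait
```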
You can make your script a lot simpler by using more of the shell's own basic features. You can split input on fields in the shell without the help of awk, sed, and tr, and you can open a file once and write to it 37 times instead of reopening it 37 times to append.
Also, why bother with all the pgrep stuff when you made yourself such a nicely formatted list of PIDs to read?
Also, if we knew what your data looked like, we could probably streamline that awk / grep / awk / head / awk chain down into one awk call.
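The field-splitting point can be shown in one line: `read` with IFS does the work of awk/sed/tr. The sample line and variable names are illustrative only:

```shell
#!/bin/bash
# Sketch: let read split fields instead of piping through awk/sed/tr.
LINE="web01 10.0.0.5 running"
IFS=' ' read -r HNAME IPADDR STATE <<< "$LINE"
echo "$HNAME,$IPADDR,$STATE"
```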
Last edited by Corona688; 09-12-2017 at 05:49 PM..
Reason: Fix IPADDR / IPADD typo
Hi,
I have a file which has a list of 200 tables e.g: table.txt
I need to do a count for each table and store it in a file.
So I did something like this:
for TABLE in `cat table.txt`
do
T_CNT=$(sqlplus -s -l / as sysdba <<EOF
set echo off heading off feedback off
SELECT count(*)
FROM... (1 Reply)
Hi All,
I have a scenario where I need to call submodules through a for loop
for (i=0; i<8000 ;i++)
{
..
BLOCKA
}
BLOCKA
{
..
..
subroutine 1;
subroutine 2;
}
I want this to be run in parallel
process1 BLOCKA
{ (6 Replies)
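One way this kind of loop is usually parallelised is to run each iteration as a background job, with a cap on how many run at once. A sketch, where BLOCKA's body, the cap of four, and the iteration count are all illustrative; `wait -n` needs bash 4.3 or later:

```shell
#!/bin/bash
# Sketch: background each iteration, throttled to four jobs in flight.
BLOCKA() {
    # subroutine 1; subroutine 2; ...
    echo "iteration $1" > "out.$1"
}

running=0
for ((i = 0; i < 8; i++))
do
    BLOCKA "$i" &                     # one background job per iteration
    running=$((running + 1))
    if [ "$running" -ge 4 ]           # throttle: at most 4 at once
    then
        wait -n                       # reap any single finished job
        running=$((running - 1))
    fi
done
wait                                  # drain the remaining jobs
```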
I had issues with processes locking up. This script checks for processes and kills them if they are older than a certain time.
It uses some functions you'll need to define or remove, like slog(), which I use for logging, and is_running(), which checks if this script is already running so you can... (0 Replies)
I am a new member of this forum and am also new to Unix shell scripting.
I joined the forum to seek help with my task, as this forum helps people.
Here's what I do manually on a daily basis:
1) Log into different Unix boxes
2) FTP the log files (more than 50 in each dir) to Windows
3) Use text pad... (3 Replies)
There is a Unix process from Oracle running, and I see it running by typing ps -fea | grep GE_CLIENTES.
The question is: how can I see if this process is running in parallel? I don't know whether that takes a Unix command or specifically a command from Oracle.
I know a parallel process is a process that... (1 Reply)
hi all,
I would like to write a shell script to monitor the processes, but if I pass the parameter, the number of processes is incorrect.
How do I solve it? Many thanks.
I got the correct number of processes with the following script:
===========================================================... (3 Replies)
Hi
I need to split a huge file into multiple smaller files using the split command.
After that I need to process each file in the background with SQL*Loader, a utility to load CSV files into Oracle.
Then I have to check the status of each of these SQL*Loader runs and, after successful... (6 Replies)
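The split / background-load / status-check flow might look like the sketch below. A sort command stands in for sqlldr so the sketch runs anywhere; a real call would be something along the lines of `sqlldr user/pass control=load.ctl data="$f" &` (arguments illustrative):

```shell
#!/bin/bash
# Sketch: split the file, launch one loader per piece, collect statuses.
printf '%s\n' r4 r2 r3 r1 > big.csv
split -l 2 big.csv part_               # makes part_aa and part_ab

declare -A PID
for f in part_*
do
    sort "$f" -o "$f.loaded" &         # stand-in for the sqlldr call
    PID[$f]=$!                         # remember this piece's PID
done

STATUS=0
for f in part_*
do
    wait "${PID[$f]}" || STATUS=1      # wait returns that loader's exit code
done
echo "overall status: $STATUS"
```

Waiting on each PID individually (rather than a bare `wait`) is what lets you see which piece failed.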
Hi guys:
I have an Oracle job which uses 10 parallel hints, and I would like to kill it when it hangs. I want to kill all the processes that have been spawned. What I do right now is get the PID of the scheduler process which initiated this job and then do a ps -ef | grep 'pid' and trace through... (1 Reply)