Quote:
Originally Posted by Amruta Pitkar
Hi MatrixMadhan,
I am not sure how to code this exactly....
Also, from what I read, do I have to stop the background processes explicitly?
What if there are some problems with the data parsing? How will the error.log be created?
Can you explain more and guide me with some sample scripts?
There is no need to stop your background processes unless there is a situation that requires it!
Would something like this be of help?
Split the file: 100k records into 10 chunks of 10k each.
Then the same process that was used to run against the 100k sample should be run against each of the smaller chunks:
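As a minimal sketch of the split step using the standard split utility (the file name bigfile.dat and the chunk_part_ prefix are just placeholders for illustration; substitute your real input file):

```shell
# For illustration only: generate a sample 100k-line input file.
# Replace "bigfile.dat" with your actual data file.
seq 100000 > bigfile.dat

# Split into 10k-line pieces; split names them chunk_part_aa, chunk_part_ab, ...
split -l 10000 bigfile.dat chunk_part_

# Rename the pieces to chunk1 .. chunk10, matching the names the loop expects.
i=1
for f in chunk_part_*
do
    mv "$f" chunk$i
    i=$(($i + 1))
done
```

Because the glob expands alphabetically, the aa..aj suffixes map to chunk1..chunk10 in order.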
i=1
while [ $i -le 10 ]
do
    /somedir/process chunk$i &    # make that a background process
    i=$(($i + 1))
done
Now with the above loop, the smaller chunks are fed to the individual processes and each one starts processing its own chunk.
By default, background processes have a lower priority than foreground processes.
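If the parent script needs to know when all the chunks are done, a plain wait after the loop is enough. A minimal sketch, using a dummy process_chunk function as a stand-in for your real /somedir/process command:

```shell
# Dummy worker standing in for /somedir/process (an assumption for
# illustration; substitute your real processing command).
process_chunk() {
    sleep 1               # pretend to do some work
    echo "done $1"        # report which chunk finished
}

# Launch one background process per chunk, as in the loop above.
i=1
while [ $i -le 10 ]
do
    process_chunk chunk$i > out$i &
    i=$(($i + 1))
done

wait    # block until every background chunk process has finished
echo "all chunks processed"
```

The wait builtin with no arguments waits for all children of the current shell, so no explicit stopping or polling of the background jobs is needed.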
You need to determine a threshold value (more of a benchmarking exercise) at which running several processes on smaller chunks does not actually degrade performance compared to running a single process on a single large chunk.
Creating the error logs works the same way as you had been doing it for the foreground process: redirect each background process's stderr to its own file.
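To make that concrete, here is a sketch of per-chunk error logging; the process_chunk function is a hypothetical stand-in that deliberately writes a parse error to stderr, so you can see where the message ends up:

```shell
# Hypothetical worker (illustration only): writes normal output to
# stdout and a simulated parse error to stderr.
process_chunk() {
    echo "processing $1"
    echo "parse error in $1" >&2
}

i=1
while [ $i -le 3 ]
do
    # 2> sends stderr to a per-chunk error log, exactly as it would
    # for a foreground run; each chunk gets its own .out and .err file.
    process_chunk chunk$i > chunk$i.out 2> chunk$i.err &
    i=$(($i + 1))
done
wait    # let all background workers finish before inspecting the logs
```

If a chunk parses cleanly its .err file is simply empty, so a quick check of the non-empty error logs tells you which chunks had problems.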