Hello,
I am an absolute newbie and whatever I've written in the shell script (below) has been built with generous help from googling the net and this forum. Please forgive any schoolboy mistakes.
From this I have to find out the response time (end time - start time) for each request id for the SvcName CO. I have written this shell script (not finished though), but the performance is very slow (it takes a minute to process a 100-line file). Can you please point me in the right direction to improve the performance? Although not the exact subject of this post, any pointers on calculating the difference between the start and end times would also be quite helpful.
Code:
#!/bin/bash
#script to compute response times for CO calls from a logfile
#////////////////////////////////////
#if no command line args
if [ $# -ne 1 ]
then
echo 1>&2 "Oops......Usage is wrong. $0 <tgtsrchfile>"
exit 2
fi
#assigning command line params to variables
srchFN=$1
#remove resptime.log if already present
checkfile="./resptime.log"
tempfile="./temp.log"
tempfile1="./temp1.log"
if [ -e "$checkfile" ];then
rm -f "$checkfile"
fi
if [ -e "$tempfile" ];then
rm -f "$tempfile"
fi
if [ -e "$tempfile1" ];then
rm -f "$tempfile1"
fi
#if keywordfile not present
if [ ! -r "$srchFN" ]; then
echo "Target search file $srchFN not present" 1>&2
exit 2
fi
#grep for request id
grep 'Start.*CO' "$srchFN" | awk -F "RequestId: " '{print $2}' > "$tempfile"
#for each request id get starttime and end time and print into temp file
while read line; do
#skip empty lines ([ -n $line ] unquoted is always true; it must be quoted)
if [ -n "$line" ];then
sttime=$(grep "Start.*CO.*$line" "$srchFN" | awk -F "," '{print $1}' | awk '{print $2}')
endtime=$(grep "End.*CO.*$line" "$srchFN" | awk -F "," '{print $1}' | awk '{print $2}')
if [ -n "$sttime" ] || [ -n "$endtime" ];then
echo "$line,$sttime,$endtime" >> "$tempfile1"
fi
fi
done < "$tempfile"
#/////////////////////////////////////
Thanks in advance
it looks like the problem is that you're rereading the whole $srchFN twice
(once per grep) for every line of your temp.log.
Try to rethink things so that you know what you're looking for while only
reading the file once.
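One way to do that single pass is with awk, keeping a small table of start times keyed by request id. This is only a sketch: the two sample log lines below (timestamp in the first comma-separated field, then "SvcName: CO" and "RequestId: N") are my guess at the format and will need adapting to the real file:

```shell
#!/bin/bash
# A fabricated two-line sample log; the real field layout may well differ.
cat > sample.log <<'EOF'
2009-04-08 10:15:02,INFO,Service,Start - SvcName: CO - RequestId: 42
2009-04-08 10:15:47,INFO,Service,End - SvcName: CO - RequestId: 42
EOF

# One pass over the file: remember the start time per RequestId, and emit
# "id,start,end" as soon as the matching End line is seen.
result=$(awk -F',' '
/Start/ && /SvcName: CO/ && match($0, /RequestId: [0-9]+/) {
    id = substr($0, RSTART + 11, RLENGTH - 11)   # skip "RequestId: "
    split($1, t, " ")                            # first field is "date time"
    start[id] = t[2]
}
/End/ && /SvcName: CO/ && match($0, /RequestId: [0-9]+/) {
    id = substr($0, RSTART + 11, RLENGTH - 11)
    split($1, t, " ")
    if (id in start) print id "," start[id] "," t[2]
}' sample.log)
echo "$result"    # 42,10:15:02,10:15:47
rm -f sample.log
```

The awk array `start[]` replaces the two greps per request id, so the log is read exactly once no matter how many requests it contains.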
My input file has one line for the start of a request and one for the end of the request. And there are hundreds of unique requests. So I am struggling to think of how else I can get the start and end time with one grep, for each req id...
something like this? substitute 'd' for your log file:
Code:
while read line ; do
    #----------------------------------------------------------------------#
    # Start time and service name.  Use command substitution rather than   #
    # piping into read: in bash the read runs in a subshell and the        #
    # variable is lost.                                                    #
    #----------------------------------------------------------------------#
    if [[ $line = *Service,Start* ]] ; then
        service_name=$(echo "$line" | sed -e 's/^.*SvcName: //' -e 's/ - .*$//')
        start_time=$(echo "$line" | sed -e 's/,INFO.*$//')
    fi
    #----------------------------------------------------------------------#
    # End time.                                                            #
    #----------------------------------------------------------------------#
    if [[ $line = *Service,End*SvcName*$service_name*EndUser* ]] ; then
        end_time=$(echo "$line" | sed -e 's/,INFO.*$//')
        echo "start-time $start_time end-time $end_time service-name $service_name"
    fi
done < d
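For the side question about subtracting the two timestamps: assuming they end up as plain HH:MM:SS strings (an assumption; adapt the parsing to whatever the log really contains), you can convert each to seconds since midnight and subtract:

```shell
#!/bin/bash
# Hypothetical timestamps; in the real script these would be the
# $start_time / $end_time values extracted above.
start_time="10:15:02"
end_time="10:15:47"

# Convert HH:MM:SS to seconds since midnight.  The 10# prefix forces
# base 10 so fields like "08" are not treated as invalid octal.
to_secs() {
    IFS=: read h m s <<< "$1"
    echo $(( 10#$h * 3600 + 10#$m * 60 + 10#$s ))
}

resp=$(( $(to_secs "$end_time") - $(to_secs "$start_time") ))
echo "response time: ${resp}s"    # response time: 45s
```

Note this naive version goes negative for a request that spans midnight; if the log carries full dates, converting each whole timestamp with GNU `date -d "$stamp" +%s` and subtracting is more robust.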