Issue with awk script parsing log file


 
# 15  
Old 06-06-2014
Quote:
Originally Posted by Ariean
It worked, thank you. I have some basic questions.

In this code statement, (tail -f -n +1 s_GenerateXMLDataFile.log& echo $! > $$.tailpid) | awk ', could you please clarify whether my understanding is wrong:

1) You enclosed the statement in parentheses because you don't want to print the standard output to the terminal; instead you are piping the output to the awk script?

2) I don't understand what the output of echo $! would be, as it prints nothing when I execute it at the shell prompt. And why did you use a single ampersand (&) between the tail and echo commands, when we would generally use a double ampersand (&&) to combine two commands?


In this code statement trap 'kill $(cat $$.tailpid);rm -f $$.tailpid' EXIT

3) What is the difference between EXIT in capitals and exit in lower case? I see both working.
4) Why didn't you put an exit statement after the commands, before the EXIT signal? For example:

trap 'kill $(cat $$.tailpid);rm -f $$.tailpid; exit' EXIT

Thank you.

------------------------------------------------------------------------

Hello, my requirement has changed: the UI developer is having difficulty capturing the output of your shell script. Now I have to get each value returned by this awk script into a shell variable and update the table with it, so the table always holds the latest count as the log file gets populated.

For example, if this is going to be the output from the awk/shell script:
Code:
100320
200640
300960
401280
501600
601920
702240
802560
902880
923096
923096

First I should take 100320 and update the table, then 200640, and so on until the last value, and then exit. Can you please help with how I can do this?
In the commands:
Code:
trap 'kill $(cat $$.tailpid);rm -f $$.tailpid' EXIT
(tail -f -n +1 s_GenerateXMLDataFile.log& echo $! > $$.tailpid) | awk '...'

Code:
tail -f -n +1 s_GenerateXMLDataFile.log&

starts tail asynchronously (AKA in the background). This tail will copy the current contents of the file to standard output and then copy newly added data to standard output shortly after it is written to the file. This tail will run forever unless it is killed by trying to write to a pipe whose read end has been closed (which won't happen here: even though awk terminates and closes the read end of the pipe when it sees the last line of the file, no new data will be added to the file after awk terminates, so tail never writes again) or it is explicitly terminated by a signal.
Code:
echo $! > $$.tailpid

writes the process ID of the last job started in the background (i.e., tail) into a file whose name is the process ID of the shell running this command followed by the string .tailpid.

When awk completes, it will exit, but tail will continue running. The:
Code:
trap 'kill $(cat $$.tailpid);rm -f $$.tailpid' EXIT

says that just before the script exits, it should execute the command:
Code:
kill $(cat $$.tailpid);rm -f $$.tailpid

which will get the process ID of the tail command out of the file created by the echo $!, send that process a SIGTERM signal, and remove the file created by the echo.
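
Putting those pieces together, here is a minimal sketch of what the getcounts script discussed in this thread might look like. The awk body is reconstructed from the fragments quoted in these posts, and the test used to recognize the last line of the log is only a placeholder:
Code:
#!/bin/bash
# Sketch only: the awk body is reconstructed from fragments quoted in this
# thread, and the end-of-log test is a placeholder.

# Just before this script exits (for any reason), kill the background tail
# and remove the file holding its PID.  EXIT names the shell's exit trap;
# the quoted commands run just before the script exits.
trap 'kill $(cat $$.tailpid);rm -f $$.tailpid' EXIT

# The single & starts tail in the background; && would instead wait for
# tail to finish (which tail -f never does) before running echo.  $! is
# the PID of that background tail, saved so the trap above can kill it.
(tail -f -n +1 s_GenerateXMLDataFile.log& echo $! > $$.tailpid) | awk '
	$5 == "Requested:" { print $6 }
	/END OF LOG/ { exit }	# placeholder: the real end-of-log test is not shown in this excerpt
'
echo 'log file complete'	# later posts suggest removing this line when piping into readcounts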

------------------------------------------------------------------------
If we assume that you named the earlier script
Code:
getcounts

and removed the
Code:
echo 'log file complete'

from it, and that you then create another script (e.g., readcounts):
Code:
#!/bin/bash
while read count
do	echo 'Do whatever'
	echo 'you want'
	echo "with $count."
done

Then you can use:
Code:
./getcounts | ./readcounts
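
If the goal is to get each count into a shell variable and push it into the table as it arrives, the body of readcounts is where that would go. A minimal sketch follows; the table update itself is left as a placeholder, since the database client and SQL for your table are not shown in this thread:
Code:
#!/bin/bash
# Each line arriving on standard input is one count from getcounts.
while read count
do	# $count holds the latest value; replace the echo below with the
	# real command that updates your table with $count.
	echo "would update table with count=$count"
done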

# 16  
Old 06-09-2014
Don,

Doing ./getcounts | ./readcounts causes the second script (readcounts) to wait until the getcounts script finishes reading the whole log file, but I want to capture the output of the awk script (or the getcounts script) in real time, into a variable or something, so I can insert it into the table. Please help.

Thank you.

# 17  
Old 06-09-2014
When I tested the scripts I gave you on Mac OS X with a program that writes one line into s_GenerateXMLDataFile.log every two seconds, the
Code:
getcounts | readcounts

pipeline printed three lines very quickly after tail forwarded to awk a line containing the data needed to print another count.

Of course, it is possible that tail -f or awk on your system buffers output when the output isn't directed to a terminal, although it would be VERY strange if tail did that (buffering would contradict the intent of -f). You haven't told us what system you're using.

If getcounts is printing counts in real time, we know that tail -f isn't buffering. So, if awk really is buffering, you could change:
Code:
        if($5=="Requested:")
                print $6

in getcounts to:
Code:
        if($5=="Requested:") {
                print $6 | "./readcounts"
                close("./readcounts") 
        }

to get rid of the effects of awk buffering.
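
If the awk you are running supports the fflush() function (gawk does, for example), another way to defeat the buffering, without starting readcounts once per count, is to flush awk's standard output after each print and keep the original getcounts | readcounts pipeline. A sketch of that change:
Code:
        if($5=="Requested:") {
                print $6
                fflush()	# flush stdout so the count reaches the pipe immediately
        }
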
# 18  
Old 06-11-2014
Please let me do some more testing and I will get back to you.
By the way, I am using Red Hat Enterprise Linux Server release 5.9 (Tikanga).

Thank you.

---------- Post updated 06-11-14 at 09:06 AM ---------- Previous update was 06-10-14 at 09:41 AM ----------

May I please know why you need to explicitly close the shell script? Wouldn't it exit by itself after execution when you pipe the output of the awk script to it?

close("./readcounts")

Thank you.

# 19  
Old 06-11-2014
Quote:
Originally Posted by Ariean
... ... ...

---------- Post updated 06-11-14 at 09:06 AM ---------- Previous update was 06-10-14 at 09:41 AM ----------

May I please know why you need to explicitly close the shell script? Wouldn't it exit by itself after execution when you pipe the output of the awk script to it?

close("./readcounts")

Thank you.
You tell us that awk is buffering output sent through a pipe. By closing the pipe to readcounts after writing each count, the buffering problem is eliminated. Unfortunately, it also means that readcounts has to be invoked for each count (instead of once for all counts).
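
Note that once each count is written with print $6 | "./readcounts" and the pipe is closed with close("./readcounts"), every invocation of readcounts sees exactly one count on its standard input, so its while loop only runs once. A sketch of what a per-invocation version could look like (the table update is again just a placeholder):
Code:
#!/bin/bash
# Started once per count: exactly one line arrives on standard input.
read count
echo "would update table with count=$count"	# replace with the real table update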