I am having a problem parsing data from a huge log file. The file is an application log, around 5 GB in size, and it rotates every midnight.
When the application encounters an issue, it sends an email with some specific info but no further details, so I need to log in to the server and grep the log file. I am having a hard time getting good, accurate parsing results.
Since I don't have the exact log file, I am providing a sample log taken from /var/log/messages on my Linux box.
The email I receive usually contains a specific timestamp, which I need in order to find the exact location of the info.
"06:10:28 mymachine kernel: This is it:: BIOS INFO 123: Get Bios info on the upper part of the log"
Based on the info above, I need to get some details from the log.
Those details are what I need, but they sit on the upper part of the log (before that line), and I don't know how to make the parsing walk upward once I have found the string "BIOS INFO 123: Get Bios info on the upper part of the log".
So my requirement is to grep based on the timestamp for the string BIOS INFO 123: Get Bios info on the upper part of the log, look at the upper part of the log, and print the 3 pieces of info I need. By the way, there is no fixed number of lines between that string and the 3 pieces of info, so I can't just count upward from the match.
Here's a sample log. I hope anyone can help me. Thanks a lot.
Oh, yes. But because your file is really huge it's better to embed this check in awk:
===
You can't just use grep: it will search the whole file, but you want to quit as soon as you have your lines (imagine your information sits in the first 100 kilobytes). I'm afraid the above awk solution would be slow because of the string regex, but you can embed the time variable (you want a variable for easy further automation) in the awk regex literal:
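Something along these lines, as a minimal sketch: it assumes the marker line looks like the quoted example and that the 3 details sit at most 50 lines above it; the timestamp, the 50-line window, and /var/log/messages are placeholders to adapt.

ts='06:10:28'
awk '
  { buf[NR % 50] = $0 }                      # keep a rolling window of the last 50 lines
  /'"$ts"'.*BIOS INFO 123/ {                 # shell variable spliced into the awk regex literal
      for (i = NR - 49; i < NR; i++)         # dump the lines sitting above the match
          if (i > 0) print buf[i % 50]
      exit                                   # stop scanning the rest of the huge file
  }' /var/log/messages

Because of the exit, awk stops reading as soon as the marker is found instead of chewing through the remaining gigabytes.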
Hi
I have a system; running uname -a gives:
Linux cmovel-db01 2.6.32-38-server #83-Ubuntu SMP Wed Jan 4 11:26:59 UTC 2012 x86_64 GNU/Linux
I would like to capture the contents of /var/log/syslog from 11:00AM to 11:30AM and send this info via email.
I was thinking of setting a cron entry for that... (2 Replies)
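A minimal sketch of the extraction step, assuming classic syslog timestamps ("Jan  4 11:26:59") with the time in the third field, that a mail/mailx command is installed, and with a placeholder recipient address:

awk '$3 >= "11:00:00" && $3 <= "11:30:00"' /var/log/syslog \
    | mail -s "syslog 11:00-11:30" admin@example.com

Run it from cron shortly after 11:30 so the window is complete; the lexical comparison works because HH:MM:SS strings sort in time order.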
unix : sun
shell : bash
I need to select multiple rows with this format:
<special format>
10 lines
/<special format>
from a log file that has lots of other info.
I thought of getting the number of the first line using
grep -n "special format" file | cut -d: -f1
then pass it to the shell... (2 Replies)
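Instead of computing line numbers and passing them back to the shell, sed can print the delimited block directly; a minimal sketch, assuming the two marker lines are literal text and logfile is a placeholder name:

sed -n '/<special format>/,/\/<special format>/p' logfile

The -n suppresses normal output, and the address range prints everything from the opening marker through the closing /<special format> line, for every such block in the file.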
I have a LOG file which looks like this
Import started at: Mon Jul 23 02:13:01 EDT 2012
Initialization completed in 2.146 seconds.
--------------------------------------------------------------------------------
--
Import summary for Import item: PolicyInformation... (8 Replies)
My intention is to log the output to a file as well as display it on the console. I have used tee (tee -a ${filename}) for this purpose. This works as expected for the first few outputs, but after some event nothing gets logged to the file any more, although it is still being displayed... (3 Replies)
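One common culprit with tee pipelines is output buffering in the command feeding the pipe rather than in tee itself; a minimal sketch of that workaround, assuming GNU coreutils' stdbuf is available and my_app stands in for the real command:

stdbuf -oL my_app 2>&1 | tee -a "${filename}"    # force line-buffered stdout so tee writes each line promptly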
Hi Experts,
I had to edit a particular value in the header line of a very huge file, so I wanted to search and replace that value in a file which was 24 GB in size. I managed to do it, but it took a long time to complete. Can anyone please tell me how we can do it in an optimised... (7 Replies)
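If the new value happens to be exactly the same length as the old one, the header can be patched in place without rewriting the other 24 GB; a minimal sketch under that assumption, where hugefile.dat, OLDVALUE and NEWVALUE are placeholders:

header=$(head -n 1 hugefile.dat)                    # read only the header line
printf '%s' "${header/OLDVALUE/NEWVALUE}" \
    | dd of=hugefile.dat conv=notrunc 2>/dev/null   # overwrite the same bytes at offset 0

If the lengths differ, there is no way around streaming the whole file to a new one (with sed or awk), because all the following bytes have to shift.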
Hi,
I have a log file that contains information such as this:
date
id number
command1
command2
command3
command4
data
data
data
date
id number
command1
command2
command3
command4 (4 Replies)
Looking for a shell script or a simple Perl script. I am new to scripting and not very good at it.
I have 2 directories. One holds a text file with a list of files in it, and the other holds a daily log which shows each file's completion time. I need to correlate the two and make a report.
... (0 Replies)
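A minimal sketch of the correlation step in awk, assuming filelist.txt has one file name per line and daily.log has lines of the form "<file name> <completion time>"; the file names and field layout are placeholders:

awk '
  NR == FNR { wanted[$1] = 1; next }    # first file: remember the names we care about
  $1 in wanted { print $1, $2 }         # second file: report the completion time for those names
' filelist.txt daily.log > report.txt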
I have a file with extracted data and need to insert a header with a constant string, say: H|PayerDataExtract
If I use sed, I have to redirect the output to a separate file, like
sed ' sed commands' ExtractDataFile.dat > ExtractDataFileWithHeader.dat
the same is true for awk
and... (10 Replies)
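With GNU sed you can insert the header in place instead of redirecting to a second file; a minimal sketch (note that -i still rewrites the file behind the scenes, it just hides the temporary file from you):

sed -i '1i H|PayerDataExtract' ExtractDataFile.dat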
I've got a 2.2 GB syslog file from our Cisco firewall appliance. The problem is that we've been seeing gaps in the syslog anywhere from 10 minutes to 2 hours. Currently I've just been using 'less' and paging through the file to see if I can find any noticeable gaps. Obviously this isn't the... (3 Replies)
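Rather than paging through 2.2 GB by eye, gawk can flag the gaps directly; a minimal sketch, assuming classic syslog timestamps ("Jan  4 11:26:59") in the first three fields, with the year, the 600-second threshold, and the file name as placeholders:

gawk '
  BEGIN { split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m, " ")
          for (i in m) mon[m[i]] = i }                # map month names to numbers
  { split($3, t, ":")
    now = mktime("2012 " mon[$1] " " $2 " " t[1] " " t[2] " " t[3])
    if (prev && now - prev > 600)                     # flag anything quieter than 10 minutes
        print "gap of", now - prev, "seconds before line", NR ": " $0
    prev = now }
' firewall.syslog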