Searching for the "exception" keyword in all logs from the last 5 minutes
Hello Folks,
I am a newbie to the world of Unix. I have a location on a server which I access through PuTTY, and the location is
and at this location the files are listed as shown
Now please advise the Unix command which can extract the last 5 minutes of logs, as I want to search for the keyword "Exception" if it appears in the logs within the last 5 minutes.
For this we would have to know the contents of said files. Logs are usually files that are written to constantly, and what was written within these last five minutes neither we (nor anyone else, for that matter) can know without looking at them.
So post a sample of the file contents and maybe we can find a pattern by which to distinguish "within-the-last-five-minutes" messages from older ones.
At any rate, you can use the grep utility to search for a certain pattern (like the word "exception") in a file. The following will search for either "exception" or "Exception" in a file and display every line containing it:
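A minimal sketch of that; since the real file name is not given in the thread, a throwaway sample.log stands in for it:

```shell
# Build a small sample log to grep against (stand-in for the real file).
printf 'INFO all good\nERROR NullPointerException thrown\nWARN exception caught\n' > sample.log

# -i matches case-insensitively, so "exception" and "Exception" both hit;
# -n shows the line number of each match.
grep -in 'exception' sample.log
```

If you want to match exactly those two capitalizations and nothing else (e.g. not "EXCEPTION"), grep '[Ee]xception' sample.log does that instead.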
Searching for exceptions in all logs of a particular directory
Hello Folks,
I am a newbie to the world of Unix. I have a location on a server which I access through PuTTY, and the location is
and this location contains various log files, some of which are listed as shown
Now please advise the Unix command which can extract the last 5 minutes of entries from all the different logs at this particular location, as I want to search for the keyword "Exception" if it appears in any of the logs within the last 5 minutes.
Below is the way the logs have been written, as I have opened them in the vi editor; please also cross-check the timestamp format on each line.
Moderator's Comments:
edit by bakunin: I figured this post belongs to the same thread as the first one here. Please follow the rule "1 theme - 1 thread" and open new threads only when you have a genuinely new problem. Otherwise just use the thread you already have open. Thank you for your consideration.
@punpun26262626: Welcome to the forum.
It is usually well received here if you put a bit more effort into formulating the spec than "Now please advise". What OS / shell / tool versions do you use? What thoughts / logic lead to the desired result? Any attempts at a solution from your side?
@sadique.manzar: nice idea, with three drawbacks:
- the patterns will match more than the "last 5 minutes", e.g. the day(s) before, "min:sec" values (in the above: /07:40/), or other similar data.
- the keyword requested is "Exception".
- there is no command substitution around the final sed command.
Combining the two proposals so far, we come up with
Even if those entries were sorted, the approach with a single threshold timestamp would require that exact value to occur verbatim in the logs, so the logs would need at least one entry per minute. And wouldn't the file need to be run through tac to retrieve the last five minutes, and then quit?
I want to extract the log entries between the current timestamp and 15 minutes before, and send an email to the people configured. I developed the script below but it's not working properly; can someone help me? I have a log file containing this pattern:
Constructor QuartzJob
... (3 Replies)
I have a log file with the following contents:
log_file_updated.txt:
Jul 5 03:33:06 rsyslogd: was
Jul 5 03:33:09 adcsdb1 rhsmd: This system is registered.
Sep 2 02:45:48 adcsdb1 UDSAgent: 2015-07-05 04:24:48.959 INFO Worker_Thread_4032813936 Accepted connection from host <unknown>... (3 Replies)
HI Everyone,
My task is to search for error messages from the last 10 minutes of a log file, each time the script runs.
My script:
date1=`date -d '10 minutes ago' "+%H:%M:%S"`
date2=`date "+%H:%M:%S"`
awk -v d1="${date1}" -v d2="${date2}" '$0~d1{p=1} $0~d2{p=0} p' filename
No error getting in... (3 Replies)
Hi folks,
I have a logs folder in which different types of logs are generated. I am monitoring them with the command below:
tail -f *.log
but I want that if an exception appears in any of the logs it should be caught, so what should I add to tail -f *.log so that it immediately catches and... (3 Replies)
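One common sketch for this: pipe the combined tail through grep. --line-buffered makes grep flush each match immediately instead of waiting for its output buffer to fill. The demo file and the timeout are only here so the example terminates; in real use you would run tail -f *.log | grep ... indefinitely:

```shell
# Demo log standing in for the real *.log files.
printf 'INFO fine\nFATAL SomeException raised\n' > app.log

# Real use: tail -f *.log | grep --line-buffered -i 'exception'
timeout 2 tail -f app.log | grep --line-buffered -i 'exception'
```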
Hi Folks,
I have just a basic query: suppose I have to monitor logs; there is a command for that. Suppose I have to monitor abc.log, which is updated dynamically within seconds; then the command, after changing to that directory, is tail -f abc.log
Now please advise what about... (1 Reply)
Hi Folks,
please advise: I have logs generated on a Unix machine at /ops/opt/aaa/bvg.log, and sometimes exceptions appear in these logs, so I want to write a script that continuously monitors them, and whenever any exception comes, it tries to find... (3 Replies)
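A sketch of such a monitor; the alert action is just an echo here, since the mail setup on that machine is unknown, and the demo file name stands in for /ops/opt/aaa/bvg.log. tail -F (unlike -f) survives log rotation. The background writer and the timeout exist only so the example terminates:

```shell
log=bvg.log   # stand-in for /ops/opt/aaa/bvg.log
printf 'normal startup line\n' > "$log"
( sleep 1; echo 'java.lang.Exception: something broke' >> "$log" ) &

# Real use: drop "timeout 3" and let the loop run forever.
timeout 3 tail -F -n +1 "$log" | while IFS= read -r line; do
    case $line in
        *Exception*) echo "ALERT: $line" ;;   # replace echo with mailx, logger, ...
    esac
done
```

The case branch is the hook: swap the echo for whatever notification command your machine actually has.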
Hey, just need one simple syntax to search for strings in the live running logs. The strings are placed in $infile, and every time the script runs it should pull each string from $infile and use it as input for grepping the live running logs on a rotational basis.
So here are the Contents... (14 Replies)
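A sketch using the $infile convention from the post: grep -f reads every search string at once, so each incoming line is checked against all of them without a per-string loop, and -F treats them as fixed strings rather than regexes. The demo files and the timeout are only there so the example terminates:

```shell
infile=patterns.txt
printf 'OutOfMemory\nTimeoutError\n' > "$infile"          # stand-in search strings
printf 'all good\njava OutOfMemory error\n' > live.log    # stand-in live log

# Real use: tail -f live.log | grep --line-buffered -Ff "$infile"
timeout 2 tail -f live.log | grep --line-buffered -Ff "$infile"
```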
Hi all,
I am in the process of building a shell script as part of an auditing utility. It will search a specified directory for keywords and output the file path and the line number the word was found on. I built a test script (shown below) that does just this, but egrep apparently... (0 Replies)
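For the path-plus-line-number report, plain grep can already do the whole job; a sketch with hypothetical directory and keyword names:

```shell
# Stand-in audit directory with one matching file.
mkdir -p auditdir
printf 'nothing here\npassword=hunter2\n' > auditdir/conf.txt

# -r recurses into the directory, -n prints line numbers,
# -H always prefixes the file path.
grep -rnH 'password' auditdir
```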
I have been trying to search for a string in close to 200 *.gz files, but I get an error. Can someone suggest a bulletproof solution, please?
zgrep 20/Aug/2008:13:50:23 request.log.*.gz
-bash: /usr/bin/zgrep: /bin/sh: bad interpreter: Argument list too long
also
zgrep 20/Aug/2008:13:50:23... (9 Replies)
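That error usually means the shell expanded the *.gz glob into more arguments than a single exec() allows. Letting find hand the files to zgrep in batches avoids the limit, and quoting the pattern keeps the shell away from it. A sketch, with two demo files standing in for the real ~200:

```shell
# Two demo files standing in for the ~200 real request.log.*.gz files.
printf '20/Aug/2008:13:50:23 GET /x\n' | gzip > request.log.1.gz
printf 'some other line\n'             | gzip > request.log.2.gz

# find batches the file names into as few zgrep invocations as will fit
# under the kernel's argument-size limit.
find . -name 'request.log.*.gz' -exec zgrep '20/Aug/2008:13:50:23' {} +
```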
Unix based fix-it needed?
Platform and feature: search programs on Apple computers (Leopard or Tiger; 10.4 and above; Spotlight)
Problem: the document search feature of these programs produces hits when the keyword(s) used appear anywhere in the document's content.
Change required: we need to... (1 Reply)