Script to search log file for last 15 mins data


 
# 1  
Old 09-26-2015
Script to search log file for last 15 mins data

Hi All,
I have a problem I'm trying to work out an approach for. I have several nodes whose syslog events I want to trigger an email alert (eventually this will feed another method of alerting, but an email is fine to start with).

Basically, the syslog file will hold several hours' worth of data, but I want to run a script every 15 minutes that searches only the last 15 minutes' worth. The data in the file will look like the sample below; note that other lines will also be present in the file, and there will be variable information after the semicolon on each line.

Code:
Sat Sep 26 12:05:41 2015 Internal trap notification 1167 (MMES1AssocFail) MME S1 Association failed;
Sat Sep 26 12:07:50 2015 Internal trap notification 1168 (MMES1AssocEstab) MME S1 Association established;
Sat Sep 26 12:07:50 2015 Internal trap notification 1190 (MMES1PathEstab) MME S1 path established;
Sat Sep 26 12:26:55 2015 Internal trap notification 1189 (MMES1PathFail) MME S1 path failed;
Sat Sep 26 12:26:55 2015 Internal trap notification 1167 (MMES1AssocFail) MME S1 Association failed;
Sat Sep 26 12:27:04 2015 Internal trap notification 1168 (MMES1AssocEstab) MME S1 Association established;
Sat Sep 26 12:27:04 2015 Internal trap notification 1190 (MMES1PathEstab) MME S1 path established;
Sat Sep 26 12:27:26 2015 Internal trap notification 1189 (MMES1PathFail) MME S1 path failed;
Sat Sep 26 12:27:26 2015 Internal trap notification 1167 (MMES1AssocFail) MME S1 Association failed;

Does anyone have any examples that would only look at the previous 15 minutes, based on the date and timestamp on each line?

Thanks in advance.
# 2  
Old 09-26-2015
Is this something you plan to run every 15 minutes (so you just want to see stuff added since your last run)?

Does the log file you're examining rotate? If so, is the rotation synchronized with your script, or do you sometimes need to examine the end of an old log file in addition to the current log file?

Are you looking for the 15 minutes of data before the time on the clock when you start your script, or are you looking for the 15 minutes of data ending with the timestamp on the last entry in your log file?

What operating system and shell are you using?
# 3  
Old 09-27-2015
Reply

Quote:
Originally Posted by Don Cragun
Is this something you plan to run every 15 minutes (so you just want to see stuff added since your last run)?
Yes, I want it to run every 15 minutes and only add new events which have appeared in the last 15 minutes.

Quote:
Originally Posted by Don Cragun
Does the log file you're examining rotate? If so, is the rotation synchronized with your script, or do you sometimes need to examine the end of an old log file in addition to the current log file?
Yes, the log file rotates; at present it's approximately once a day, and when the file rotates the previous one is also gzipped. The rotation isn't synced with my script, as it's based on volume. If we can examine the old log file in the case of rotation, that would be extremely helpful.


Quote:
Originally Posted by Don Cragun
Are you looking for the 15 minutes of data before the time on the clock when you start your script, or are you looking for the 15 minutes of data ending with the timestamp on the last entry in your log file?
15 minutes of data before the time on the clock.

Quote:
Originally Posted by Don Cragun
What operating system and shell are you using?
GNU/Linux, shell being used is bash.

Thanks
# 4  
Old 09-30-2015
Quote:
Originally Posted by mutley2202
Quote:
Originally Posted by Don Cragun
Is this something you plan to run every 15 minutes (so you just want to see stuff added since your last run)?
Yes, I want it to run every 15 minutes and only add new events which have appeared in the last 15 minutes.
OK. Note that this means that if there is any delay in starting one of your 15-minute runs, some events that appeared just over 15 minutes before the late-starting run may be missed, and if the next run starts on time, some events may be picked up by two runs.
Quote:
Originally Posted by mutley2202
Quote:
Originally Posted by Don Cragun
Does the log file you're examining rotate? If so, is the rotation synchronized with your script, or do you sometimes need to examine the end of an old log file in addition to the current log file?
Yes, the log file rotates; at present it's approximately once a day, and when the file rotates the previous one is also gzipped. The rotation isn't synced with my script, as it's based on volume. If we can examine the old log file in the case of rotation, that would be extremely helpful.
I assume that you realize that you need to grab any events logged to your log file after the last run of your script before you gzip it; otherwise, on the next run you'll need to unzip it, gather the events from the end of the old log file, rezip it, and then run your script on the start of the new log file.
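For illustration only, reading the tail of an already-rotated, gzipped old log without permanently unzipping it could be done by streaming it through zcat; the filename and the number of lines already reported are hypothetical here:

Code:
# Stream the gzipped old log and skip the lines already reported on a
# previous run (here, a hypothetical 1234 lines).
zcat /var/log/syslog.20150926.gz | tail -n +1235
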
Quote:
Originally Posted by mutley2202
Quote:
Originally Posted by Don Cragun
Are you looking for the 15 minutes of data before the time on the clock when you start your script, or are you looking for the 15 minutes of data ending with the timestamp on the last entry in your log file?
15 minutes of data before the time on the clock.
As mentioned before, doing it this way means that you may miss some events and may process some events twice. I strongly suggest that instead of trying to match based on timestamps you instead keep track of the line number of the last line processed in the previous run and on the next run just start processing with the next line in that log file. Doing it this way will keep you from missing events and keep you from processing some events twice.
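A minimal sketch of that line-number bookkeeping might look like the following; the log and state-file paths are hypothetical:

Code:
#!/bin/bash
# Sketch only: remember how many lines were reported last time and print
# just the lines added since then.  Paths are examples.
LOG=/var/log/syslog
STATE=/var/tmp/syslog.lastline

last=$(cat "$STATE" 2>/dev/null || echo 0)
total=$(wc -l < "$LOG")

# If the file now has fewer lines than we recorded, it has been rotated;
# start again from the top.
[ "$total" -lt "$last" ] && last=0

# Print only the lines added since the previous run, then record where we stopped.
tail -n "+$((last + 1))" "$LOG"
echo "$total" > "$STATE"
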

But, if you want to do it just based on timestamps, you can use the GNU date utility's -d option with an option-argument of "now - 15 minutes" and a format of "+%s" to get the number of seconds since the Epoch for 15 minutes ago, use date -d again with an option-argument built from the 2nd, 3rd, 5th, and 4th fields on each log line (month, day, year, and hr:min:sec) and the same format string, and then select events whose timestamp is greater than the seconds since the Epoch 15 minutes ago. (Note that you also want to reject any events whose timestamp is later than your start time, i.e. more than 900 seconds after that cutoff. Events meeting this criterion occurred after your script started and should be picked up by the next run of your script instead of by this run.)
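For example, a rough filter along those lines might look like this (GNU date assumed; the log path is hypothetical):

Code:
#!/bin/bash
# Sketch only: keep log entries timestamped within the last 15 minutes but
# not after the moment this run started.  Calling date once per line is slow;
# awk with mktime() would be faster on large logs.
set -f                                     # no globbing when splitting lines
LOG=/var/log/syslog

start=$(date +%s)                          # when this run began
cutoff=$(date -d 'now - 15 minutes' +%s)   # epoch seconds 15 minutes ago

while IFS= read -r line; do
	# Fields: 1=weekday 2=month 3=day 4=hh:mm:ss 5=year
	set -- $line
	ts=$(date -d "$2 $3 $5 $4" +%s 2>/dev/null) || continue
	if [ "$ts" -gt "$cutoff" ] && [ "$ts" -le "$start" ]; then
		printf '%s\n' "$line"
	fi
done < "$LOG"
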
Quote:
Originally Posted by mutley2202
Quote:
Originally Posted by Don Cragun
What operating system and shell are you using?
GNU/Linux, shell being used is bash.

Thanks
Hope this helps.
# 5  
Old 10-01-2015
Response

Hi Don,
Thanks for the reply. Do you have any examples of this that I could use for reference (with your points above included)? My experience in this area isn't great, hence the questions, but hopefully with some guidance I can learn and understand.

Thanks
# 6  
Old 10-03-2015
Quote:
I strongly suggest that instead of trying to match based on timestamps you instead keep track of the line number of the last line processed in the previous run and on the next run just start processing with the next line in that log file. Doing it this way will keep you from missing events and keep you from processing some events twice.
Further to Don Cragun's comments, and if I have understood the task correctly:

You need a script running under cron every 15 minutes which copies the current logfile and compares its line count with that of the copy taken on the previous run. The script then processes the difference with a "tail -n <number of new lines>" (see the sketch at the end of this post).

This is a basic technique to avoid "tail -f" on logfiles which will get deleted by log rotation scripts. It also avoids reading the logfile and processing all the timestamps in the logfile.

Don't forget the "first time" condition when the T-15 minutes copy does not exist. Your first run of the cron will just prepare the files.
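If it helps, here is a rough sketch of that copy-and-compare approach; the paths and the grep pattern are examples only:

Code:
#!/bin/bash
# Sketch only: compare the current log with the copy taken on the previous
# run and process just the newly added lines.
LOG=/var/log/syslog
SNAP=/var/tmp/syslog.prev

if [ ! -f "$SNAP" ]; then
	# First run: just take a baseline copy and exit.
	cp "$LOG" "$SNAP"
	exit 0
fi

cp "$LOG" "$SNAP.new"
old=$(wc -l < "$SNAP")
new=$(wc -l < "$SNAP.new")

if [ "$new" -ge "$old" ]; then
	# Process only the lines added since the previous run.
	tail -n "$((new - old))" "$SNAP.new" | grep 'MMES1AssocFail'
else
	# Fewer lines than last time: the log was rotated, so process it all.
	grep 'MMES1AssocFail' "$SNAP.new"
fi

mv "$SNAP.new" "$SNAP"
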
# 7  
Old 10-06-2015
You might use something like the following as a starting point. This script can be used to rotate log files, to extract unreported entries from old and new log files, to compress old log files (after extracting the unreported entries), and to send email containing the extracted entries to a list of addresses.

This was written and tested using a 1993 version of the Korn shell, but will also work with a recent bash. Obviously, you'll have to adjust variables naming the directory in which your log file is located and the name of your log file. This code assumes that the zipped old log files are to be kept in the same directory. You'll have to make adjustments if you want to move the zipped files to another directory or if you don't like the timestamp I chose as the extension used to name old log files.

Note that the first time you run this script, it will mail out the entire log file. After that it will keep track of what it reported on the previous run and just mail out entries added since the last run.

Code:
#!/bin/ksh
ec=0				# Final exit code.
IAm="${0##*/}"			# Basename of this program.
LOGDIR="/path/to/log/directory"	# Directory containing log files.
LOGFILE="syslog"		# Name of log file.
LOGPAT="[.][2-9][0-9][0-9][0-9][01][0-9][0-3][0-9]-[0-2][0-9]:[0-5][0-9]:[0-5][0-9]"
STATUSFILE="$LOGFILE.spot"	# Status file
TMPF="$IAm.$$"			# Temp file to hold extracted log entries.
Usage="SYNOPSIS
    $IAm [-hr]"			# Synopsis for this program
Help="NAME	$IAm -- Rotate and extract recent entries from log files

$Usage

DESCRIPTION
    The $IAm utility shall process the log file:
	$LOGDIR/$LOGFILE
    in various ways.  With the -r option, the current log file will be renamed
    by appending a timestamp to the end of the log file filename and a new log
    file shall be created.

    When invoked without options, $IAm shall extract log file entries from
    renamed log files and the current log file and mail them to selected
    administrators.  If any renamed log files are present, $IAm shall zip
    those renamed log files after the log entries are extracted.

OPTIONS
   -h	Help.  Print this help message and exit.
   -r	Rotate log files.  If the current log file is not an empty file, the
    	current log file shall be moved to a file with an extension representing
	the current date in the format:
	    $LOGDIR/$LOGFILE.YYYYMMDD-hh:mm:ss

INPUT FILES
    $LOGDIR/$LOGFILE
    	Current log file.

    $LOGDIR/$STATUSFILE
    	$IAm status file.

    $LOGDIR/$LOGFILE.datestamp
    	Old logfile(s).

OUTPUT FILES
    $LOGDIR/$STATUSFILE
    	Update $IAm status file.

    $LOGDIR/$LOGFILE.datestamp.gz
    	Zipped old logfile(s).

EXIT STATUS
    0	Successful completion.
    >0	An error occurred.

APPLICATION USAGE
    Note that this application will only work if each entry written to the log
    file is performed as a single write operation and the file descriptor used
    to write those entries is opened (for appending) before an entry is written
    and closed after each entry is written."

# Move to the directory containing the log and status files...
cd "$LOGDIR" || exit 1

# Process command-line arguments...
while getopts hr name
do	case "$name" in
	(h)	printf '%s\n' "$Help"
		exit 0;;
	(r)	if [ -s "$LOGFILE" ]
		then	mv "$LOGFILE" "$LOGFILE.$(date '+%Y%m%d-%T')" &&
			>> "$LOGFILE" || ec=2
		fi
		exit $ec;;
	(?)	printf '%s\n' "$Usage" >&2
		exit 3;;
	esac
done
shift $((OPTIND - 1))
if [ $# -ne 0 ]
then	printf '%s: No operands expected.\n%s\n' "$IAm" "$Usage" >&2
	exit 4
fi

# Get last line number processed from the status file...
if [ ! -r "$STATUSFILE" ]
then	printf '%s: WARNING: Status file not found.  Resetting to line 0.\n' \
	    "$IAm" >&2
	if ! echo 0 > "$STATUSFILE"
	then	printf '%s: Cannot create status file.  Exiting.\n' "$IAm" >&2
		exit 5
	fi
	last=0
else	read -r last < "$STATUSFILE"
fi

# Get list of unzipped log files to process...
list=( "$LOGFILE"$LOGPAT "$LOGFILE" )
if [ "${list[0]}" = "$LOGFILE$LOGPAT" ]
then	# There are no old log files, reset the list...
	list=( "$LOGFILE" )
fi

# Extract entries added since last run and update status file...
awk -v last="$last" -v StatusFile="$STATUSFILE" '
FNR == 1 && filecount++ {
	# We have found the 1st line in a log file after the 1st log file,
	# reset "last" so we extract all entries from this log file.
	last = 0
}
FNR > last # Extract entries that have not been included in earlier reports.
END {	# Update status file.
	print FNR > StatusFile
}' "${list[@]}" > "$TMPF" || ec=$((ec + 1))

# Compress any uncompressed old log files...
for ((i = $((${#list[@]} - 2)); i >= 0; i--))
do	gzip "${list[$i]}" || ec=$((ec + 1))
done

# Send mail reporting status of this run...
if [ -s "$TMPF" ]
then	subject="New log entries found $(date)"
else	subject="No new log entries found $(date)"
fi
mailx -s "$subject" user@abc.com user2@xyz.com < "$TMPF" || ec=$((ec + 1))
rm -f "$TMPF"
exit $ec
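
For scheduling, a hypothetical pair of crontab entries might look like this, assuming the script above is saved as /usr/local/bin/logreport and made executable:

Code:
# Mail any new log entries every 15 minutes...
*/15 * * * * /usr/local/bin/logreport
# ...and rotate the log once a day just before midnight.
55 23 * * * /usr/local/bin/logreport -r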

 