Need to get all the records from a log file greater than timestamp supplied
Post 302797223 by megh on Monday 22nd of April 2013 05:13:31 AM
It is giving a syntax error near line 1.

---------- Post updated at 04:13 AM ---------- Previous update was at 03:49 AM ----------

I tried the piece of code below on Ubuntu and it works fine to fetch the timestamp, but on Solaris 5.10 it does not work.

Code:
line='[4/19/13 0:49:32:250 EDT] 00000026 ThreadMonitor W   WSVR0605W: Thread "WebContainer : 1" (00000027) has been active for 701879 milliseconds and may be hung.  There is/are 1 thread(s) in total in the server that may be hung.'

Code:
echo $line|awk -F"[/ \\\][]" '{print  $5}'

Any suggestions?
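A possible direction, not from the original post: Solaris 5.10's default /usr/bin/awk is the legacy awk, which does not support a regular-expression field separator, so an -F value that GNU awk on Ubuntu accepts will not behave the same there (and running gawk-style commands under the old awk is a common source of "syntax error near line 1" messages). One workaround, sketched below, is to run the same extraction with nawk or /usr/xpg4/bin/awk, which do accept regex separators; the field number matches the original command.

Code:
# Sketch only: same extraction, but with nawk (or /usr/xpg4/bin/awk),
# which accepts a regular-expression field separator on Solaris 5.10.
line='[4/19/13 0:49:32:250 EDT] 00000026 ThreadMonitor W   WSVR0605W: Thread "WebContainer : 1" (00000027) has been active for 701879 milliseconds and may be hung.'
echo "$line" | nawk -F'[][/ ]' '{print $5}'     # prints 0:49:32:250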

Last edited by radoulov; 04-22-2013 at 06:41 AM..
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Spooling a log file with timestamp

Hi, from a shell script I am invoking sqlplus to connect to an Oracle database and then I spool a CSV file as output. What I want to do is change the file name to include a timestamp, so that after spooling finishes the shell script renames the file with the timestamp. Can someone help me do that? Thanks... (2 Replies)
Discussion started by: ukadmin
2 Replies
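A minimal sketch of the rename step asked about above; the spooled file name and timestamp format are placeholders, not details from that thread.

Code:
# After sqlplus finishes spooling, rename the CSV to include a timestamp.
ts=$(date +%Y%m%d_%H%M%S)
mv report.csv "report_${ts}.csv"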

2. Shell Programming and Scripting

List all log records logged after $timestamp ?

I am trying to find a way to list every record inside a file (usually a log file) that is present after a record matching/greater than a timestamp supplied by another script. The timestamp can be anywhere inside the record and it is usually in the standard `date` format (will not look for other... (5 Replies)
Discussion started by: Browser_ice
5 Replies
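One way to approach the request above, as a sketch: if each record begins with a timestamp that sorts lexically (for example "2013-04-19 00:49:32"), a plain string comparison in awk selects everything at or after a supplied cutoff. The cutoff value and field layout here are assumptions, not details from that thread.

Code:
# Print every record whose leading "date time" sorts at or after the cutoff.
CUTOFF='2013-04-19 00:49:32'
awk -v c="$CUTOFF" '($1 " " $2) >= c' logfile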

3. Shell Programming and Scripting

concatenate log file lines up to timestamp

Hi, Using sed awk or perl I am trying to do something similar to https://www.unix.com/shell-programming-scripting/105887-sed-awk-concatenate-lines-until-blank-line-2.html but my requirement is slightly different. What I am trying to accomplish is to reformat a logfile such that all lines... (4 Replies)
Discussion started by: AlanC
4 Replies

4. Shell Programming and Scripting

How to search backwards in a log file by timestamp of entries?

Hello. I'm not nearly good enough with awk/perl to create the logfile scraping script that my boss is insisting we need immediately. Here is a brief 3-line excerpt from the access.log file in question (actual URL domain changed to 'aaa.com'): 209.253.130.36 - - "GET... (2 Replies)
Discussion started by: kevinmccallum
2 Replies
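A rough sketch of one way to scan a log backwards, assuming GNU tac is available and an Apache-style timestamp; both are assumptions, not details from that thread. It prints entries from the end of the file until the line containing a given timestamp is reached, then stops.

Code:
# Newest entries first; stop once the line with the target timestamp prints.
tac access.log | awk '/19\/Apr\/2013:00:49:32/ {print; exit} {print}'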

5. Shell Programming and Scripting

Delete log file entries based on the Date/Timestamp within log file

If a log file is in the following format 28-Jul-10 ::: Log message 28-Jul-10 ::: Log message 29-Jul-10 ::: Log message 30-Jul-10 ::: Log message 31-Jul-10 ::: Log message 31-Jul-10 ::: Log message 1-Aug-10 ::: Log message 1-Aug-10 ::: Log message 2-Aug-10 ::: Log message 2-Aug-10 :::... (3 Replies)
Discussion started by: vikram3.r
3 Replies
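A sketch for the layout shown above (dd-Mon-yy ::: message), keeping only entries on or after a cutoff date; the cutoff value (1-Aug-10) is an assumption for illustration.

Code:
# Map dd-Mon-yy to a sortable yymmdd key and keep entries >= the cutoff.
awk -v cutoff=100801 'BEGIN {
    split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m, " ")
    for (i = 1; i <= 12; i++) mon[m[i]] = sprintf("%02d", i)
}
{
    split($1, d, "-")                          # d[1]=day d[2]=Mon d[3]=yy
    key = d[3] mon[d[2]] sprintf("%02d", d[1])
    if (key + 0 >= cutoff) print
}' logfile > logfile.trimmed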

6. Shell Programming and Scripting

AWK: Cannot read Number of records greater than 1(NR>1)

Hi all, I have a tab-delimited text file of size 10 MB. I am trying to count the number of lines using grep -c . sample.txt or wc -l < sample.txt or awk 'END {print NR}' sample.txt. All these commands show the count as 1, which means they are reading only the first header line of the file.... (3 Replies)
Discussion started by: mehar
3 Replies
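A likely explanation, offered as an assumption rather than something stated in that thread: a 10 MB file that every tool counts as one line usually ends its lines with carriage returns (\r) instead of newlines. Converting and recounting confirms it.

Code:
# If this prints a sensible line count, the file uses \r line endings.
tr '\r' '\n' < sample.txt | wc -l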

7. Shell Programming and Scripting

Identifying files with a timestamp greater than a given timestamp

I need to be able to identify files with file timestamps greater than a given timestamp. I am using the following solution, although it appears to compare files at the "seconds" granularity and I need it at the milliseconds. When I tested my solution, it missed files that had timestamps... (3 Replies)
Discussion started by: nkm0brm
3 Replies
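A sketch of one way to get sub-second comparison with a reasonably recent GNU find; whether that is available in the original environment is an assumption, and the path and timestamp are placeholders.

Code:
# GNU find accepts a fractional-seconds timestamp, so files modified after
# the given millisecond instant are listed.
find /var/log -type f -newermt '2013-04-19 00:49:32.250' -print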

8. Shell Programming and Scripting

prepend timestamp to continuously updating log file

Hi, I have a process which outputs to a log. Below is the code snippet: process &> $LOGFILE& The log file keeps on updating whenever a transaction is processed. The log file has a time stamp added so every time I kill the process and start the process a new log file is... (4 Replies)
Discussion started by: rajkumarme_1
4 Replies
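A sketch of stamping each line as it is written, rather than post-processing the log; "process" and $LOGFILE follow the snippet quoted above, and the date format is an assumption.

Code:
# Pipe the process output through a loop that prefixes every line with a
# timestamp as it arrives.
process 2>&1 | while IFS= read -r line; do
    printf '%s %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$line"
done >> "$LOGFILE" &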

9. Shell Programming and Scripting

Changing information supplied when running file

On Unix systems I can call `file` to return the file type. file cel.vik $ cel.vik: ASCII text How can I append additional information when I create a file, such that when I call `file` it returns that additional information? (2 Replies)
Discussion started by: kristinu
2 Replies

10. UNIX for Beginners Questions & Answers

Filter records from a log file based on timestamp

Dear Experts, I have a log file that contains a timestamp, and I would like to filter records from that file based on the timestamp. For example, refer to the file below - cat sample.txt Jan 19 20:51:48 mukul-Vostro-14-3468 systemd: pam_unix(systemd-user:session): session opened for user root by (uid=0)... (6 Replies)
Discussion started by: mukulverma2408
6 Replies
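A sketch for the syslog-style lines shown above: map "Mon dd HH:MM:SS" to a sortable key and keep records at or after a cutoff. The cutoff value is an assumption for illustration, and the year is ignored because syslog lines do not carry one.

Code:
awk -v cutoff='01 19 20:00:00' 'BEGIN {
    split("Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec", m, " ")
    for (i = 1; i <= 12; i++) mon[m[i]] = sprintf("%02d", i)
}
{
    key = mon[$1] " " sprintf("%02d", $2) " " $3   # e.g. "01 19 20:51:48"
    if (key >= cutoff) print
}' sample.txt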
xfs_logprint(8)                    System Manager's Manual                    xfs_logprint(8)

NAME
       xfs_logprint - print the log of an XFS filesystem

SYNOPSIS
       xfs_logprint [ options ] device

DESCRIPTION
       xfs_logprint prints the log of an XFS filesystem (see xfs(5)). The device argument is the pathname of the
       partition or logical volume containing the filesystem. The device can be a regular file if the -f option is
       used. The contents of the filesystem remain undisturbed.

       There are two major modes of operation in xfs_logprint. One mode is better for filesystem operation
       debugging. It is called the transactional view and is enabled through the -t option. The transactional view
       prints only the portion of the log that pertains to recovery. In other words, it prints out complete
       transactions between the tail and the head. This view tries to display each transaction without regard to
       how they are split across log records.

       The second mode starts printing out information from the beginning of the log. Some error blocks might
       print out in the beginning because the last log record usually overlaps the oldest log record. A message is
       printed when the physical end of the log is reached and when the logical end of the log is reached. A log
       record view is displayed one record at a time. Transactions that span log records may not be decoded fully.

OPTIONS
       -b     Extract and print buffer information. Only used in transactional view.

       -c     Attempt to continue when an error is detected.

       -C filename
              Copy the log from the filesystem to the file filename. The log itself is not printed.

       -d     Dump the log from front to end, printing where each log record is located on disk.

       -D     Do not decode anything; just print data.

       -e     Exit when an error is found in the log. Normally, xfs_logprint tries to continue and unwind from bad
              logs. However, sometimes it just dies in bad ways. Using this option prevents core dumps.

       -f     Specifies that the filesystem image to be processed is stored in a regular file at device (see the
              mkfs.xfs(8) -d file option). This might happen if an image copy of a filesystem has been made into an
              ordinary file with xfs_copy(8).

       -l logdev
              External log device. Only for those filesystems which use an external log.

       -i     Extract and print inode information. Only used in transactional view.

       -q     Extract and print quota information. Only used in transactional view.

       -n     Do not try and interpret log data; just interpret log header information.

       -o     Also print buffer data in hex. Normally, buffer data is just decoded, so better information can be
              printed.

       -s start-block
              Override any notion of where to start printing.

       -t     Print out the transactional view.

SEE ALSO
       mkfs.xfs(8), mount(8).