Top Forums Shell Programming and Scripting Extracting data from large logs. Post 302126786 by Klashxx on Friday 13th of July 2007 04:53:30 AM
An awk solution — it buffers each line, and when a line matches it prints the 7 preceding lines, the matching line itself, and the 3 lines that follow:
Code:
awk '
BEGIN { i = 1 }
{
    vec[i] = $0                     # buffer the current line
    i++
    if ( $0 ~ /Pattern/ ) {
        # print the 7 lines before the match plus the match itself;
        # the guard handles a match within the first 7 lines of input
        for (j = (i - 8 < 1 ? 1 : i - 8); j < i; j++)
            print vec[j]
        # print the 3 lines after the match
        for (j = 1; j <= 3; j++)
            if ((getline) > 0)      # stop cleanly at end of file
                print
        i = 1                       # reset the buffer
    }
}' file
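A quick way to see the behavior is to run the same script against a small sample; the file name sample.log and the pattern ERROR below are illustrative assumptions, not from the original post:

```shell
# Build a 12-line sample where the match sits on line 8.
printf '%s\n' line1 line2 line3 line4 line5 line6 line7 \
              ERROR after1 after2 after3 tail > sample.log

# Same buffered-context technique as above: 7 lines before,
# the match, then 3 lines after.
awk '
BEGIN { i = 1 }
{
    vec[i] = $0
    i++
    if ($0 ~ /ERROR/) {
        for (j = (i - 8 < 1 ? 1 : i - 8); j < i; j++)
            print vec[j]
        for (j = 1; j <= 3; j++)
            if ((getline) > 0)
                print
        i = 1
    }
}' sample.log
# Prints line1..line7, ERROR, then after1 after2 after3 (11 lines).
```

On systems with GNU grep the equivalent is simply `grep -B7 -A3 'Pattern' file`; the awk version is useful where grep lacks the -A/-B context options.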


Last edited by Klashxx; 07-13-2007 at 06:20 AM..
 
