Combine multiple unique lines from event log text file into one line, use PERL or AWK?


 
# 1  
Old 11-06-2012

I can't decide whether to use AWK or Perl. After poring over these forums for hours today, I decided I'd post something and see if I couldn't get some advice.

I've got a text file full of hundreds of events in this format:

Record Number : 1
Records in Seq : 1
Offset in Seq : 1
Time : DD/MM/YY 00:00:00
Vendor ID : XXXXXXX
Application ID : XXXXX
Application Version : XXXXX
API Library : XXX
API Version : XXXX
Host Name : HOST1
OS Name : UNIX
OS Revision : Ver2
Client Host :
Process ID : 00000000
Task ID : 00000000
Function Class : N/A
Action Code : CODE
Text : Any length of description could potentially go here
Initialization type : Any length of description here also
Username : USERID
Activity ID : ACTIVITYID

Each of these events is 20 lines long, with the last line being the Activity ID. Then there is a blank line, and the start of a new event.

I want to do the following:
1) Open a text file. Remove the first 5 lines as they contain information that is not needed.
2) Combine all 20 lines of the event into one string, with a delimiter of '^' between each header. I chose '^' because that's the only special character I could find that wasn't already somewhere in the file. Ideally it would look like this after being collapsed:

Record Number : 1^Records in Seq : 1^Offset in Seq : 1^Time : DD/MM/YY 00:00:00^Vendor ID : XXXXXXX^Application ID : XXXXX^Application Version : XXXXX^API Library : XXX^API Version : XXXX^Host Name : HOST1^OS Name : UNIX etc etc

3) Once it reaches the end of one event, loop and read the next 20 lines as a new event to collapse, proceeding through the entire text file that way.
4) It would be nice to get rid of the spaces after the ':', but that's not necessary.
5) If I can get that far, I then have to set this up so I only pull the most recent events from the file (i.e., that day's records only).

I am new to both Perl and AWK; I have more experience with Python. I have pored over these forums and tutorial pages and can find many examples of how to collapse lines, but they lack detail about what each command does. I'm solution-agnostic, so it can be Perl or awk, as long as someone can explain all the little pieces of their code so I can LEARN it, understand it, and possibly modify it in the future! The more info the better! Thanks in advance....
# 2  
Old 11-06-2012
awk is more than adequate for the task. You can use the built-in ORS (output record separator) to replace the usual line feed with a ^ character and string the log lines together. Another built-in, NF (number of fields), will tell you when a new section is coming. Yet another built-in, NR (the current record number), will help you bypass the first 5 lines.

As an example:

Code:
awk '{if(NR<=5) next; if(NF>0){ ORS="^" } else { ORS="\n" }; print $0}'  your_log_file
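Since the goal here is to learn what each piece does, the same one-liner can be written out with comments, one rule per line (your_log_file is just a placeholder name; note that awk uses next, not continue, to skip to the next input line):

```shell
awk '
NR <= 5 { next }          # NR = current line number; skip the 5 header lines
NF > 0  { ORS = "^" }     # non-blank line: terminate the coming print with "^"
NF == 0 { ORS = "\n" }    # blank line: end of event, terminate with a newline
        { print }         # print the current line followed by the current ORS
' your_log_file
```

One side effect of this approach is a trailing ^ at the end of each collapsed event, since the last data line is still printed with ORS set to ^.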

# 3  
Old 11-06-2012
Code:
tail +6 yourfile | awk '{$1=$1;gsub(" *: *",":")}1' RS= FS="\n" OFS="^"

(Depending on your OS, you may need to use: tail -n +6 yourfile | awk .... )
Code:
awk 'NR<2{sub(".*"$6,$6)}{$1=$1;gsub(" *: *",":")}1' RS= FS="\n" OFS="^" yourfile
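To unpack what these one-liners rely on: RS= puts awk into "paragraph mode", where records are separated by blank lines, FS="\n" makes each log line one field of the record, and OFS="^" is the glue used when the record is rebuilt. The assignment $1=$1 looks like a no-op, but it forces awk to reconstruct $0 with OFS between the fields. A minimal sketch of the idea, with sample data invented for illustration:

```shell
printf 'Host Name : HOST1\nOS Name : UNIX\n\nHost Name : HOST2\nOS Name : AIX\n' |
awk '{ $1 = $1              # force awk to rebuild $0 with OFS between fields
       gsub(" *: *", ":")   # optionally strip the spaces around each colon
     } 1' RS= FS="\n" OFS="^"
# Each blank-line-separated event becomes one line:
#   Host Name:HOST1^OS Name:UNIX
#   Host Name:HOST2^OS Name:AIX
```

The lone 1 at the end is an always-true pattern whose default action is to print the (now rebuilt) record.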


Last edited by ctsgnb; 11-06-2012 at 09:30 PM..
# 4  
Old 11-06-2012
A method to filter out any number of (even non-consecutive) items using awk:

put the following in a file called myawk.awk (or whatever)
Code:
BEGIN {
    FS = " : ";
    Skip["Record Number"] = 1;
    Skip["Records in Seq"] = 1;
    # put whatever you want filtered out using the above format                                                    
}
{
    if ($1 == "")                 # blank line: end of an event
        printf("\n")
    else if (!($1 in Skip))       # keep only headers not listed in Skip
        printf("^%s:%s", $1, $2)
}

Code:
awk -f myawk.awk your_log_file
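A quick usage sketch with invented sample data, to show what the filter produces: headers listed in the Skip array are dropped, every other line is joined as ^name:value, and the blank line between events becomes the output newline.

```shell
# Save the script above as myawk.awk, then feed it a small sample:
cat > sample.log <<'EOF'
Record Number : 1
Host Name : HOST1
OS Name : UNIX

EOF
awk -f myawk.awk sample.log
# Prints one line per event; "Record Number" was filtered out:
#   ^Host Name:HOST1^OS Name:UNIX
```

Note each collapsed event starts with a leading ^, since every kept pair is printed with the separator in front of it.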
