Need to remove the duplicate lines from a log!!


 
# 1  
Old 12-20-2011

Hello Folks,

Can someone help me remove duplicate lines from a log file and send the result to another log file? It's a bit complicated: some lines are identical except for the timestamp, while other lines are unique. The fields are separated by colons.

Log file:
=================================================

FORTE:APP:ERROR:nachversorgung.sh logarchivloeschung.sql FP:CRON:for3 :W:0001:Fehler: ##### Fehler in logarchivloeschung, Anzahl Redologs zu gross, 22 Logs nach 10 Laeufen!!! #####:TM 2011.12.20-12:40:43

FORTE:APP:ERROR:nachversorgung.sh logarchivloeschung.sql FP:CRON:for3 :W:0001:Fehler: ##### Fehler in logarchivloeschung, Anzahl Redologs zu klein, 7 Logs #####:TM 2011.12.19-16:35:36

FORTE:APP:ERROR:nachversorgung.sh logarchivloeschung.sql FP:CRON:for3 :W:0001:Fehler: ##### Fehler in logarchivloeschung, Anzahl Redologs zu gross, 22 Logs nach 10 Laeufen!!! #####:TM 2011.12.16-12:42:55

FORTE:APP:ERROR:nachversorgung.sh logarchivloeschung.sql FP:CRON:for3 :W:0001:Fehler: ##### Fehler in logarchivloeschung, Anzahl Redologs zu klein, 7 Logs #####:TM 2011.12.15-16:32:46

FORTE:APP:ERROR:nachversorgung.sh logarchivloeschung.sql FP:CRON:for3 :W:0001:Fehler: ##### Fehler in logarchivloeschung, Anzahl Redologs zu klein, 7 Logs #####:TM 2011.12.15-12:33:36
==================================================================

It should remove the duplicate lines, leaving output like the following, and send that data to another file.
==================================================================
FORTE:APP:ERROR:nachversorgung.sh logarchivloeschung.sql FP:CRON:for3 :W:0001:Fehler: ##### Fehler in logarchivloeschung, Anzahl Redologs zu gross, 22 Logs nach 10 Laeufen!!! #####:TM 2011.12.20-12:40:43

FORTE:APP:ERROR:nachversorgung.sh logarchivloeschung.sql FP:CRON:for3 :W:0001:Fehler: ##### Fehler in logarchivloeschung, Anzahl Redologs zu klein, 7 Logs #####:TM 2011.12.19-16:35:36

==================================================================

The only difference between the lines is the timestamp. Hope I'm clear.
# 2  
Old 12-20-2011
If you're okay with the new file not including timestamps, it would be really simple:
Code:
sed 's/#####:TM.*//' file | sort -u > newfile
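For anyone trying this later, the pipeline can be sanity-checked on a throwaway sample (the file paths and shortened sample messages below are made up):

```shell
# A throwaway sample: two distinct messages, the first one repeated
# with a different timestamp.
cat > /tmp/sample.log <<'EOF'
FORTE:APP:ERROR:nachversorgung.sh:Fehler: ##### zu gross #####:TM 2011.12.20-12:40:43
FORTE:APP:ERROR:nachversorgung.sh:Fehler: ##### zu klein #####:TM 2011.12.19-16:35:36
FORTE:APP:ERROR:nachversorgung.sh:Fehler: ##### zu gross #####:TM 2011.12.16-12:42:55
EOF

# Drop everything from the "#####:TM" marker onward, then keep each
# remaining line once; /tmp/newfile ends up with the two unique messages.
sed 's/#####:TM.*//' /tmp/sample.log | sort -u > /tmp/newfile
```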

If you need timestamps, I can offer this BASH snippet, but it can only get rid of duplicate ADJACENT lines:
[EDIT: Removed. Compared to the elegance of CarloM's awk, this was ridiculous]

I'm sure there's a better way. I think some awk-master is gonna need to post for a perfect solution.

Last edited by ryran; 12-20-2011 at 12:27 PM..
# 3  
Old 12-20-2011
Code:
awk -F"TM" '{lines[$1]=$2} END {for (i in lines) {print i lines[i]}}' inputfile

or
Code:
awk -F"TM" '!(lines[$1]) {lines[$1]=$2} END {for (i in lines) {print i lines[i]}}' inputfile

Quote:
Originally Posted by ryran
If you need timestamps, I can offer this BASH snippet, but it will only get rid of duplicate ADJACENT lines
You could sort the file on input.
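For instance, since sorting makes identical messages adjacent, a single pass that compares each line's pre-timestamp part with the previous one would do. A rough sketch with made-up sample data, assuming (as the awk above does) that "TM" appears only in the timestamp marker:

```shell
# Sample log: one message repeated with different timestamps.
cat > /tmp/forte.log <<'EOF'
FORTE:ERROR:zu gross:TM 2011.12.20-12:40:43
FORTE:ERROR:zu klein:TM 2011.12.19-16:35:36
FORTE:ERROR:zu gross:TM 2011.12.16-12:42:55
EOF

# Sort to group duplicates, then print a line only when the part
# before "TM" differs from the previous line's; the timestamp of the
# first line in each sorted group is kept.
sort /tmp/forte.log | awk -F'TM' '$1 != prev {print} {prev = $1}' > /tmp/forte.dedup
```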
# 4  
Old 12-20-2011
The last occurrence of a line, along with its time stamp, is retained (the timestamp is taken to be the last whitespace-separated field):
Code:
perl -ane '$t=pop @F; $x{join(" ",@F)}=$t; END{print "$_ $x{$_}\n" for keys %x}' inputfile

# 5  
Old 12-20-2011
Quote:
Originally Posted by CarloM
Code:
awk -F"TM" '{lines[$1]=$2} END {for (i in lines) {print i lines[i]}}' inputfile
awk -F"TM" '!(lines[$1]) {lines[$1]=$2} END {for (i in lines) {print i lines[i]}}' inputfile

Man that's awesome.

To the original poster: Carlo could give all the details, but the big difference between his two solutions is that the guarded one (with !(lines[$1])) grabs the first occurrence of a line, while the unconditional one grabs the last. That means you'll get either the first timestamp or, probably more helpfully, the timestamp of the last occurrence, depending on which you pick. The impressive perl solution also grabs the last occurrence.
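The difference is easy to see on a two-line toy input (sample data made up):

```shell
# The same message with two timestamps, oldest first.
cat > /tmp/dup.log <<'EOF'
FORTE:msg #####:TM 2011.12.15-12:33:36
FORTE:msg #####:TM 2011.12.20-12:40:43
EOF

# Unconditional assignment: each duplicate overwrites the stored
# timestamp, so the LAST one wins.
awk -F'TM' '{lines[$1]=$2} END {for (i in lines) print i lines[i]}' /tmp/dup.log
# -> FORTE:msg #####: 2011.12.20-12:40:43

# Guarded assignment: !(lines[$1]) is true only the first time a key
# is seen, so the FIRST timestamp wins.
awk -F'TM' '!(lines[$1]) {lines[$1]=$2} END {for (i in lines) print i lines[i]}' /tmp/dup.log
# -> FORTE:msg #####: 2011.12.15-12:33:36
```

(Note that because "TM" is the field separator, it gets eaten on output; the original one-liners have the same quirk.)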

Oh yeah, but your file is backwards from traditional logfiles--I forgot. Anywhoo.. I'm sure you'll work it out. Glad to read these solutions.
# 6  
Old 12-21-2011
Thanks everyone. All the solutions are working.