Using uniq on log files


 
Top Forums Shell Programming and Scripting Using uniq on log files
# 1  
Old 06-03-2010
Using uniq on log files

I have a log file in which I need to count the number of repeated lines and do some manipulation.

Code:
test.log:
June  3 10:30:22 test 1
June  3 10:31:22 test 2
June  3 10:32:22 test 2
June  3 10:33:22 test 3
June  3 10:33:22 test 3
June  3 10:34:22 test 4
June  3 10:35:22 test 5

Code:
Applied: sort +0.16 test.log | uniq -c +16 | sort -n

and achieved:

Code:
   1 June  3 10:30:22 test 1
   1 June  3 10:34:22 test 4
   1 June  3 10:35:22 test 5
   2 June  3 10:31:22 test 2
   2 June  3 10:33:22 test 3
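(Editor's note, not from the original post: `sort +0.16` and `uniq +16` use the obsolete pre-POSIX offset syntax. A sketch of the same pipeline with modern POSIX options, assuming the fixed 16-character timestamp prefix shown above:)

```shell
# Rebuild the sample log so this snippet is self-contained
cat > test.log <<'EOF'
June  3 10:30:22 test 1
June  3 10:31:22 test 2
June  3 10:32:22 test 2
June  3 10:33:22 test 3
June  3 10:33:22 test 3
June  3 10:34:22 test 4
June  3 10:35:22 test 5
EOF

# sort -k 1.17  : sort starting at character 17 of the line (old: sort +0.16)
# uniq -c -s 16 : count duplicates, ignoring the first 16 chars (old: uniq -c +16)
sort -k 1.17 test.log | uniq -c -s 16 | sort -n
```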

Now I'm trying to achieve the following, where I tag every line and show the repeat count only for messages that appeared more than once:

Code:
Msg: June  3 10:30:22 test 1
Msg: June  3 10:34:22 test 4
Msg: June  3 10:35:22 test 5
Repeated count: 2 Msg: June  3 10:31:22 test 2
Repeated count: 2 Msg: June  3 10:33:22 test 3

How do I cut and put them together using awk and sed? Any help or advice is much appreciated.
# 2  
Old 06-03-2010
Quote:
Msg: June 3 10:30:22 test 1
Msg: June 3 10:34:22 test 4
Msg: June 3 10:35:22 test 5
Repeated count: 2 Msg: June 3 10:31:22 test 2
Repeated count: 2 Msg: June 3 10:33:22 test 3
Is that your file now?
What is the output you are expecting?
# 3  
Old 06-03-2010
This is my file now:
Code:
 
   1 June  3 10:30:22 test 1
   1 June  3 10:34:22 test 4
   1 June  3 10:35:22 test 5
   2 June  3 10:31:22 test 2
   2 June  3 10:33:22 test 3

And this is my expected output:

Code:
Msg: June 3 10:30:22 test 1
Msg: June 3 10:34:22 test 4
Msg: June 3 10:35:22 test 5
Repeated count: 2 Msg: June 3 10:31:22 test 2
Repeated count: 2 Msg: June 3 10:33:22 test 3

I need to show the repeat count for messages that appeared more than once, and tag each line.
# 4  
Old 06-03-2010
Quote:
Originally Posted by jazzaddict
This is my file now:
Code:
 
   1 June  3 10:30:22 test 1
   1 June  3 10:34:22 test 4
   1 June  3 10:35:22 test 5
   2 June  3 10:31:22 test 2
   2 June  3 10:33:22 test 3

And this is my expected output:

Code:
Msg: June 3 10:30:22 test 1
Msg: June 3 10:34:22 test 4
Msg: June 3 10:35:22 test 5
Repeated count: 2 Msg: June 3 10:31:22 test 2
Repeated count: 2 Msg: June 3 10:33:22 test 3

I need to show the repeat count for messages that appeared more than once, and tag each line.
Code:
$
$
$ cat f0
   1 June  3 10:30:22 test 1
   1 June  3 10:34:22 test 4
   1 June  3 10:35:22 test 5
   2 June  3 10:31:22 test 2
   2 June  3 10:33:22 test 3
$
$
$ awk '{if ($1 > 1){print "Repeated count: "$1" Msg: "$2,$3,$4,$5,$6} else{print "Msg: "$2,$3,$4,$5,$6}}' f0
Msg: June 3 10:30:22 test 1
Msg: June 3 10:34:22 test 4
Msg: June 3 10:35:22 test 5
Repeated count: 2 Msg: June 3 10:31:22 test 2
Repeated count: 2 Msg: June 3 10:33:22 test 3
$
$

tyler_durden
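(Editor's note, a sketch not from the thread: the one-liner above re-prints fields `$2..$6`, which collapses the double space in "June  3" and would break if a message had a different field count. A variant that keeps the count, strips it with `sub()`, and prints the rest of the line untouched:)

```shell
# f0 is the counted file from the post above; rebuild it here so the
# example is self-contained
cat > f0 <<'EOF'
   1 June  3 10:30:22 test 1
   1 June  3 10:34:22 test 4
   1 June  3 10:35:22 test 5
   2 June  3 10:31:22 test 2
   2 June  3 10:33:22 test 3
EOF

# Save the count, delete the leading "   N " prefix, then tag the line;
# sub() leaves the rest of the line (including double spaces) intact
awk '{c = $1; sub(/^ *[0-9]+ /, ""); print (c > 1 ? "Repeated count: " c " " : "") "Msg: " $0}' f0
```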
# 5  
Old 06-03-2010
Code:
awk '
# Key each line on its last two fields (the message), count occurrences,
# and remember the first full line seen for each message
{
	key = $(NF-1) FS $NF
	count[key]++
	if (!(key in first)) first[key] = $0
}
# At the end, print singletons first, then collect and print the repeats
END {
	for (k in first) {
		if (count[k] == 1)
			print "Msg: " first[k]
		else
			repeats = (repeats ? repeats RS : "") "Repeated count: " count[k] " Msg: " first[k]
	}
	if (repeats) print repeats
}' in.file
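(Editor's note, not from the thread: the steps above can also be combined into a single pipeline over the raw log, assuming the fixed 16-character timestamp prefix from post #1 — count duplicate messages while ignoring the timestamp, then tag each line by its count:)

```shell
# Rebuild the raw sample log from post #1 so this is self-contained
cat > test.log <<'EOF'
June  3 10:30:22 test 1
June  3 10:31:22 test 2
June  3 10:32:22 test 2
June  3 10:33:22 test 3
June  3 10:33:22 test 3
June  3 10:34:22 test 4
June  3 10:35:22 test 5
EOF

# Sort by the message (char 17 onward), count ignoring the timestamp,
# order by count, then tag: "Msg:" for singletons, count prefix for repeats
sort -k 1.17 test.log | uniq -c -s 16 | sort -n |
awk '{c = $1; sub(/^ *[0-9]+ /, ""); print (c > 1 ? "Repeated count: " c " " : "") "Msg: " $0}'
```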
