04-25-2007
I think it can also be done this way:
sort file1 | uniq -d > file2
This extracts the duplicated lines and redirects them to file2.
You can also put the number of occurrences of every line at the beginning of each line:
sort file1 | uniq -c > file2
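As a quick illustration (the sample contents here are assumed, not from the thread): if file1 holds the three lines apple, apple, banana, then
sort file1 | uniq -d
prints just apple (each duplicated line once), while
sort file1 | uniq -c
prints
      2 apple
      1 banana
with the occurrence count in front of every line.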
10 More Discussions You Might Find Interesting
1. UNIX for Advanced & Expert Users
Hi,
I have a file with duplicate lines in it. I want to keep only the duplicate lines and delete the non-duplicates. Can someone please help me?
Regards
Narayana Gupta (3 Replies)
Discussion started by: guptan
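A sketch of how that request could be met with the same idea as the answer above (the file names are assumed):
sort file | uniq -d > dupes_only     # one copy of every line that occurs more than once
sort file | uniq -D > dupes_only     # every copy of the duplicated lines (GNU uniq only)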
2. UNIX for Dummies Questions & Answers
I have a log file "logreport" that contains several lines as seen below:
04:20:00 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping
06:38:08 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping
07:11:05 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but... (18 Replies)
Discussion started by: Nysif Steve
3. Shell Programming and Scripting
Hi,
I need to concatenate three files into one destination file. If any duplicate data occurs, it should be deleted.
e.g.:
file1:
-----
data1 value1
data2 value2
data3 value3
file2:
-----
data1 value1
data4 value4
data5 value5
file3:
-----
data1 value1
data4 value4 (3 Replies)
Discussion started by: Sharmila_P
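One possible sketch for this, reusing the file names from the example (the output name dest is an assumption):
sort -u file1 file2 file3 > dest                     # sorted output, duplicates removed
cat file1 file2 file3 | awk '!seen[$0]++' > dest     # original order, first occurrence kept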
4. AIX
I have a set of files with data, and the same fields appear multiple times in each of the files.
for example:
file 1
name = mary kate
last name = kate
address = 123
street = abc
name = mary mark
last name = mark
address = 456
street = bcd
file 2
name = mary kate
last name = kate... (2 Replies)
Discussion started by: relearner
5. Shell Programming and Scripting
Hi All,
I am trying to remove the duplicate entries in a file and print them just once. For example, if my input file has:
00:44,37,67,56,15,12
00:44,34,67,56,15,12
00:44,58,67,56,15,12
00:44,35,67,56,15,12
00:59,37,67,56,15,12
00:59,34,67,56,15,12
00:59,35,67,56,15,12... (7 Replies)
Discussion started by: faiz1985
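The excerpt is cut off, so it is not clear whether whole lines or only a key field define a duplicate; assuming whole lines (file names assumed), a sketch would be:
awk '!seen[$0]++' input.csv > deduped.csv        # keep the first copy of each exact line
awk -F, '!seen[$1]++' input.csv > deduped.csv    # or key only on the first comma-separated field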
6. Shell Programming and Scripting
Hi,
I have a file with a size of 10 MB and I need to redirect some specific lines to a new file. For example:
Executed Restore for 11227.EDCS.551.01.201110 from /tmp/bk/restore/CR81500/content/S24U15VA2.2010-10-29.16:49.EDT/ArchiveFile_11227.EDCS.551.01.201110.zip
Operation output:
Oct... (4 Replies)
Discussion started by: gsiva
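The selection rule is not visible in the truncated excerpt; if the goal is simply to send the lines matching a fixed phrase to a new file, a grep sketch (the phrase and file names are assumptions) could be:
grep 'Executed Restore' input.log > restore_lines.txt       # matching lines only
grep -v 'Executed Restore' input.log > other_lines.txt      # everything else, if needed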
7. UNIX for Advanced & Expert Users
Hi All,
I have a huge file (4 GB) which has duplicate lines. I want to delete the duplicate lines, leaving only unique lines. Sort, uniq, and awk '!x++' are not working as they run out of buffer space.
I don't know if this works: I want to read each line of the file in a for loop, and want to... (16 Replies)
Discussion started by: krishnix
16 Replies
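One sketch that usually avoids the buffer problem is to let GNU sort do its external (disk-based) merge instead of holding everything in memory; the buffer size and temporary directory below are assumptions to adjust for the local machine:
LC_ALL=C sort -u -S 512M -T /var/tmp bigfile > bigfile.dedup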
8. Shell Programming and Scripting
hello all
In my bash script I have a file and I only want to keep the lines that appear twice in the file. Is there a way to do this?
thanks in advance! (4 Replies)
Discussion started by: vlm
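A sketch for keeping only the lines that occur exactly twice (the file name is assumed; the file is read twice, once to count and once to print):
awk 'NR==FNR {cnt[$0]++; next} cnt[$0]==2 && !printed[$0]++' file file > twice_only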
9. UNIX for Dummies Questions & Answers
I have a file with the following data:
A
B
C
I would like to print it like this n times (for example, 5 times):
A
B
C
A
B
C
A
B
C
A
B
C
A (7 Replies)
Discussion started by: nsuresh316
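A sketch for repeating a file's contents n times (here 5; the file and output names are assumed):
for i in 1 2 3 4 5; do cat file; done > repeated
n=5; for i in $(seq "$n"); do cat file; done > repeated     # same thing with a count variable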
10. Shell Programming and Scripting
Hi,
I have a csv file which contains some millions of lines in it.
The first line (the header) repeats at every 50000th line. I want to remove all the duplicate headers from the second occurrence onward (the first line should not be removed).
I don't want to use any pattern from the Header as I have some... (7 Replies)
Discussion started by: sudhakar T
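Since the poster does not want to rely on the header's text, a sketch that simply remembers the first line and drops any later line identical to it (the file names are assumed) would be:
awk 'NR==1 {hdr=$0; print; next} $0 != hdr' big.csv > no_dup_headers.csv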