Remove duplicates from a file

# 1  
Old 01-21-2010
Remove duplicates from a file

Hi,

I need to remove duplicates from a file. The file looks like this:
Code:
0003 10101 20100120 abcdefghi
0003 10101 20100121 abcdefghi
0003 10101 20100122 abcdefghi
0003 10102 20100120 abcdefghi
0003 10103 20100120 abcdefghi
0003 10103 20100121 abcdefghi

Here, if the first column and second column repeat, I need to pick only the first record; if they do not repeat, I need to keep the record as it is.

The output should be:

Code:
0003 10101 20100120 abcdefghi
0003 10102 20100120 abcdefghi
0003 10103 20100120 abcdefghi

Thanks in advance for the help. The script can be in Perl or a Unix shell.
# 2  
Old 01-21-2010
If the values are ordered, their format is fixed, and the uniq implementation on your platform supports the -w option:

Code:
uniq -w10 infile
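
To illustrate (assuming GNU coreutils uniq and the sample data saved as infile): the first two fields plus the separating space occupy exactly 10 characters ("0003 10101"), so -w10 limits the comparison to that key prefix and ignores everything after the second field:

Code:
% uniq -w10 infile
0003 10101 20100120 abcdefghi
0003 10102 20100120 abcdefghi
0003 10103 20100120 abcdefghi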

Otherwise use awk:

Code:
awk '!_[$1,$2]++' infile

On Solaris you should use gawk, nawk or /usr/xpg4/bin/awk.
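
In case the terse awk form is unclear, here is a longhand sketch of the same idea (functionally equivalent to the one-liner above; the array name is arbitrary):

Code:
awk '{
    key = $1 SUBSEP $2      # same composite key the one-liner builds with the ($1,$2) subscript
    if (!seen[key]++)       # the counter is 0, so the test is true, only on first sight of the key
        print               # print only the first record for each key
}' infile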

Or Perl:

Code:
perl -ane'print unless $_{$F[0], $F[1]}++' infile


Last edited by radoulov; 01-28-2010 at 05:24 AM. Reason: corrected
# 3  
Old 01-22-2010
Code:
 sort -u -k1,2 infile
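
One caveat (my reading, not verified on every implementation): -u keeps a single line per key, but which line of a duplicate group survives is not guaranteed to be the first one in input order, so if the first occurrence is strictly required the awk/Perl solutions above are more portable. On the sample data this typically gives the desired result:

Code:
% sort -u -k1,2 infile
0003 10101 20100120 abcdefghi
0003 10102 20100120 abcdefghi
0003 10103 20100120 abcdefghi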

# 4  
Old 01-22-2010
Hello,
I found this code on a web page; it is said to be valid only for GNU/Linux (GNU sed) and to delete all lines except duplicate ones. I hope it works (sorry, I'm using Solaris 10 and couldn't try it).

# delete all lines except duplicate lines (emulates "uniq -d").
Code:
sed '$!N; s/^\(.*\)\n\1$/\1/; t; D' infile

# 5  
Old 01-27-2010
Could you explain the perl code please?

Quote:
Originally Posted by radoulov
If the values are ordered, their format is fixed, and the uniq implementation on your platform supports the -w option:

Code:
uniq -w10 infile

Otherwise use awk:

Code:
awk '!_[$1,$2]++' infile

On Solaris you should use gawk, nawk or /usr/xpg4/bin/awk.

Or Perl:

Code:
perl -ane'print unless $_{@F[0..1]}++' infile

# 6  
Old 01-28-2010
Quote:
Originally Posted by gpaulose
Could you explain the perl code please?
Yes,
first of all, the code is wrong.
It should be:

Code:
unless $_{$F[0],$F[1]}++

... not:

Code:
unless $_{@F[0..1]}++

So the script becomes:

Code:
perl -ane'print unless $_{$F[0],$F[1]}++' infile

First, the command-line switches:

Quote:
-a turns on autosplit mode when used with a -n or -p. An implicit
split command to the @F array is done as the first thing inside
the implicit while loop produced by the -n or -p.
Quote:
-e commandline
may be used to enter one line of program. If -e is given, Perl
will not look for a filename in the argument list. Multiple -e
commands may be given to build up a multi-line script. Make sure
to use semicolons where you would in a normal program.
Quote:
-n causes Perl to assume the following loop around your program, which makes it iterate over filename arguments somewhat like sed -n or awk:

    LINE:
    while (<>) {
        ...             # your program goes here
    }

Note that the lines are not printed by default. See -p to have
lines printed. If a file named by an argument cannot be opened
for some reason, Perl warns you about it and moves on to the next
file.
So we have the input file read line by line and the @F array automatically populated.

Code:
print unless $_{$F[0],$F[1]}++

Print the current record unless the expression $_{$F[0],$F[1]}++ returns true in boolean context. We build the hash %_ whose keys (the first two fields joined with the subscript separator $;) are associated with auto-incremented integers. When a given key ($F[0] $; $F[1]) is seen for the first time, its value is still 0, i.e. false, because of the post-increment (k++, not ++k), so the record is printed.

This will make the concept clear:

Code:
% perl -lane'print $_, " -> ", $_{$F[0],$F[1]}++' infile
0003 10101 20100120 abcdefghi -> 0
0003 10101 20100121 abcdefghi -> 1
0003 10101 20100122 abcdefghi -> 2
0003 10102 20100120 abcdefghi -> 0
0003 10103 20100120 abcdefghi -> 0
0003 10103 20100121 abcdefghi -> 1

We want only the records with value 0.
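
If you want to see the subscript separator itself (a side demo, not part of the solution; $; defaults to "\034", the ASCII FS character, which cat -v displays as ^\):

Code:
% perl -e'$h{"a","b"} = 1; print keys %h' | cat -v
a^\b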

Hope this helps.
# 7  
Old 01-28-2010
Another approach in Perl:

Code:
perl -ane ' ! $_{$F[0],$F[1]}++ and print ' infile.txt

perl -ane ' ! $_{$F[0],$F[1]}++ && print '  infile.txt

perl -ane ' print while ! $_{$F[0],$F[1]}++ '  infile.txt
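
A note on the third form (my own reading, not from the original post): print as a statement modified by while tests the condition before each print, so for an unseen key the first test sees 0 (the counter becomes 1 and the line prints) and the second test sees 1, which ends the loop; for a key already seen nothing prints. Spelled out with comments:

Code:
perl -ane'
    # same logic as the "unless" one-liner above:
    # the ($F[0],$F[1]) counter is 0, hence the test is true, only on first sight
    print while ! $_{$F[0],$F[1]}++
' infile.txt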
