Sort, Uniq, Duplicates


 
# 1  
Old 05-16-2007
Sort, Uniq, Duplicates

Input File is :
-------------
25060008,0040,03,
25136437,0030,03,
25069457,0040,02,
80303438,0014,03,1st
80321837,0009,03,1st
80321977,0009,03,1st
80341345,0007,03,1st
84176527,0047,03,1st
84176527,0047,03,
20000735,0018,03,1st
25060008,0040,03,

I am using the following in the script :
------------------------------------
cat InputFile | sort -t, -k1,2 | uniq -d > "Duplicates"

This gets 25060008,0040,03, into the Duplicates file.
But I also want 84176527,0047,03, in the Duplicates file.

Basically, I want the script to sort on the first two fields (comma-delimited) and, if duplicates are found on the first two fields, write those records to the "Duplicates" file.

Please guide.
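
For reference, one way to express "duplicates on the first two comma-separated fields" with standard tools is sketched below (a sketch only, assuming GNU cut/sort/uniq/grep; dup_keys is just a hypothetical scratch file, and the keys are assumed to contain no regex metacharacters):
Code:
# list the first-two-field keys that occur more than once
cut -d, -f1,2 InputFile | sort | uniq -d | sed 's/^/^/' > dup_keys
# pull every record whose key is duplicated out of the original file
grep -f dup_keys InputFile > "Duplicates"

Note that this writes every occurrence of a duplicated key, whereas uniq -d writes one line per group.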
# 2  
Old 05-16-2007
Try this:
Code:
sort -t, -k1,2 InputFile | awk -F, '{ if ((key=$1 "," $2)==prv_key) print; prv_key=key}' > "Duplicates"
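
For readability, the same one-liner expanded with comments (a sketch; it should behave the same, printing the second and later occurrences of each key):
Code:
sort -t, -k1,2 InputFile | awk -F, '
    {
        key = $1 "," $2      # key built from the first two fields
        if (key == prv_key)  # same key as the previous (sorted) record
            print            # -> duplicate: write it out
        prv_key = key        # remember the key for the next record
    }' > "Duplicates"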

Jean-Pierre.
# 3  
Old 05-16-2007
25060008,0040,03,

This is the only line that is duplicated.
# 4  
Old 05-16-2007
Quote:
This gets 25060008,0040,03, into the Duplicates file.
But I also want 84176527,0047,03, in the Duplicates file.

Basically, I want the script to sort on the first two fields (comma-delimited) and, if duplicates are found on the first two fields, write those records to the "Duplicates" file.

In the above sample of records, only the third field ('03') is common, not the first or the second field.

How would you expect those to be treated as duplicates based on the first two fields?
# 5  
Old 05-17-2007
Sort, Uniq, Duplicates

Hi MatrixMadhan,
Please look at the input file:
84176527,0047,03,1st
84176527,0047,03,
These are duplicate records if I sort on the 1st and 2nd fields.

I sorted the issue out with:
sort -t, -k1,2 -u inputfile > unq
sort -t, -k1,2 inputfile > non-unq
comm -23 non-unq unq > duplicates
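
comm -23 prints only the lines that appear in the first file and not in the second, so the extra copies that sort -u dropped from unq end up in the duplicates file. The same comparison can also be written without the intermediate files, using process substitution (a sketch, assuming bash):
Code:
comm -23 <(sort -t, -k1,2 inputfile) <(sort -t, -k1,2 -u inputfile) > duplicates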

MatrixMadhan, Jean-Pierre: Thanks.
# 6  
Old 05-17-2007
Code:
awk -F, '{ key = $1 "," $2            # key on the first two fields
           line[key] = $0             # remember one record for this key
           arr[key]++                 # count how often the key occurs
         }
     END{ for (i in arr)
              if (arr[i] > 1)         # key seen more than once -> duplicate
                  print line[i] > "duplicates"
        }' file
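
One property of this approach worth noting (an observation, not from the original post): line[key] keeps only the last record read for each key, so exactly one representative of each duplicated key is written. If every occurrence should be kept instead, the records can be collected per key, e.g. (a sketch in the same style):
Code:
awk -F, '{ key = $1 "," $2
           recs[key] = (key in recs) ? (recs[key] ORS $0) : $0   # collect all records for this key
           cnt[key]++
         }
     END{ for (k in cnt)
              if (cnt[k] > 1)                  # key seen more than once
                  print recs[k] > "duplicates"
        }' file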
