Sort, duplicate removal - Query


 
# 1  
Old 04-03-2008
Sort, duplicate removal - Query

Hi All,

I have a problem with the sort and duplicate-filter command I am using in one of my scripts. I have a '|'-delimited file and want to sort it and remove duplicates on fields 1, 2, and 15. These fields constitute the primary key of the table I will be loading the data into. However, I see that only some of the duplicates are being filtered out; the rest are not. I am wondering why this is happening. Could somebody help me out with this, please?

Thanks!!!

The command -

sort -t "|" +0 -1 +1 -2 +14 -u unsorted_input.dat > sorted_dup_output.dat
# 2  
Old 04-03-2008
Show input

Can you post what the unsorted input data looks like?
# 3  
Old 04-03-2008
If the following doesn't work, please post a sample of the file:

Look into the -k option to sort. It will allow you to sort by specified fields.

However, I have never tried this together with the -u flag, so I'm curious as to whether it will not only sort on the -k fields, but also uniq on them...
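
For what it's worth, something along these lines might do it, assuming -u really does compare only the -k keys rather than the whole line (untested; field numbers taken from your first post):

Code:
sort -t '|' -k1,1 -k2,2 -k15,15 -u unsorted_input.dat > sorted_dup_output.dat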

ShawnMilo
# 4  
Old 04-03-2008
Hi,

Here is the unsorted input:

2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000136|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000136|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000137|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000141|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000142|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000143|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000137|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000138|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000138|CR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000139|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000139|CR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000140|CR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000140|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000141|CR


I want to sort on fields 1, 2, and 4 and remove duplicates based on these.

So I run:

sort -t "|" +0 -1 +1 -2 +3 -u dup_chk_inp.dat > dup_chk_otpt.dat

The output:

2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000136|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000137|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000138|CR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000138|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000139|CR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000139|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000140|CR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000140|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000141|CR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000141|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000142|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000143|DR

It has worked for the first two rows, but not for the rest. It seems to be taking the 5th field into account for duplicate removal as well.
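
If I understand the old +pos syntax correctly, +3 with no closing -4 makes the last key run from field 4 through the end of the line, which would explain why the 5th field is being compared as well. So maybe adding the end position would do it, something like this (just a guess, I have not tried it yet):

Code:
sort -t "|" +0 -1 +1 -2 +3 -4 -u dup_chk_inp.dat > dup_chk_otpt.dat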

Does that sound right, or is there a better way to fix this?

Thanks!!!
# 5  
Old 04-03-2008
Code:
cat file
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000136|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000136|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000137|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000141|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000142|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000143|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000137|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000138|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000138|CR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000139|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000139|CR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000140|CR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000140|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000141|CR

Code:
sort -t"|" -k1,2 -k4,4 -u file

2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000136|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000137|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000138|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000139|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000140|CR
2008-04-01 10:55:01|2008-04-01 14:19:42|1|999001000141|CR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000142|DR
2008-04-01 10:55:01|2008-04-01 14:19:42|23|999001000143|DR
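
Here -k1,2 is a single key spanning fields 1 through 2, and -k4,4 is a second key restricted to field 4; with -u, sort compares only those keys when deciding what counts as a duplicate, so field 5 is ignored. One thing I'm not sure of is whether every sort implementation guarantees which of the duplicate lines survives. If you need to be certain of keeping the first occurrence in the original input order, an awk one-liner is a common alternative (an untested sketch using the same field numbers):

Code:
# print a line only the first time its (field 1, field 2, field 4) combination appears
awk -F'|' '!seen[$1,$2,$4]++' file > dup_chk_otpt.dat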

 