Delete Duplicate records from a tilde delimited file


 
# 1  
Old 12-05-2007
Delete Duplicate records from a tilde delimited file

Hi All,

I want to delete duplicate records from a tilde-delimited file. The criterion is the combination of the first two fields, which has to be unique. Below is a sample of the records in the input file:

1620000010338~2446694087~0~20061130220000~A00BCC1CT
1620000126196~2446694087~0~20061130220000~A00BCC1CT
1620000126196~2446694087~1~20061430220000~A00BCC1CT
1620000127475~2446694087~0~20061130220000~A00BCC1CT
1620000134743~2446694087~0~20061130220000~A00BCC1CT
1620000134743~2446694087~0~20060930220000~A00BCC1CT

Here we notice that records 3 and 6 are duplicates. Please let me know how to do this in a shell script.

Thanks in Advance
# 2  
Old 12-05-2007
Code:
 awk '!x[$1,$2]++' FS="~" filename

Use nawk or /usr/xpg4/bin/awk on Solaris.
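
If the one-liner looks cryptic: x[$1,$2]++ evaluates to 0 (false) the first time a given combination of the first two fields is seen and to a non-zero value afterwards, so the leading ! makes awk print only the first record for each combination. A longer, commented version of the same idea (just a sketch, using the same file name as above; the names key and seen are only illustrative):

Code:
awk -F'~' '
{
    key = $1 SUBSEP $2      # combination of the first two fields
    if (!(key in seen)) {   # first record with this combination?
        print               # keep it
        seen[key] = 1       # remember the key so later duplicates are dropped
    }
}' filename
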
# 3  
Old 12-05-2007
Code:
nawk -F'~' '!a[$1,$2]++' myFile
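
Applied to the sample in the first post, both one-liners keep the first record for each key and drop records 3 and 6, so the expected output is:

Code:
1620000010338~2446694087~0~20061130220000~A00BCC1CT
1620000126196~2446694087~0~20061130220000~A00BCC1CT
1620000127475~2446694087~0~20061130220000~A00BCC1CT
1620000134743~2446694087~0~20061130220000~A00BCC1CT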

# 4  
Old 12-05-2007
And another one (if sorting is acceptable):

Code:
sort -ut~ -k1,2 filename
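
One difference worth noting: unlike the awk versions, sort -u also reorders the file, and which of the duplicate lines survives can depend on the sort implementation. A quick way to see how many duplicates are removed (a sketch, assuming the same file name as above):

Code:
wc -l filename                      # total number of records
sort -ut~ -k1,2 filename | wc -l    # records left after removing duplicates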


# 5  
Old 12-05-2007
In Perl (note that this keeps the last record seen for each key rather than the first, and the output order is not preserved):

Code:
perl -e ' while (<>) { chomp; my @arr = split(/~/); $fileHash{$arr[0].$arr[1]} = $_  } foreach my $k ( keys %fileHash) { print $fileHash{$k} . "\n" } ' filename

# 6  
Old 12-06-2007
Thanks a lot, guys, for your replies. They helped me a lot.