Deleting duplicate records from file 1 if records from file 2 match


 
# 1  
Old 05-05-2012

I have 2 files.

"File 1" is delimited by ";" and "File 2" is delimited by "|".


File 1 below (3 records shown):
Code:
Doc1;03/01/2012;New York;6 Main Street;Mr. Smith 1;Mr. Jones
Doc2;03/01/2012;Syracuse;876 Broadway;John Davis;Barbara Lull
Doc3;03/01/2012;Buffalo;779 Old Windy Road;Charles O'Brien;Bill Rudd


File 2 below (4 records shown):
Code:
6 Main Street|New York
345 Tipp Road|Brewser
885 Peartree|Buffalo
779 Old Windy Road|Buffalo


"File 1" is faily small, "File 2" is huge.


My problem: Line by line, I need to compare the 3rd field (city) and 4th field (address) of each record in "File 1" against the 1st field (address) and 2nd field (city) of each record in "File 2" to make sure that there are no record matches.

All records that do not match should be copied out or > redirected to a new file (the edited file). If there is a match then that record should not be copied out to the edited file.


In other words given the example data above from "File 1" and "File 2" the "new edited file" should look like this:
Code:
Doc2;03/01/2012;Syracuse;876 Broadway;John Davis;Barbara Lull

The other 2 records below would be discarded because they matched "File 2":
Code:
Doc1;03/01/2012;New York;6 Main Street;Mr. Smith 1;Mr. Jones
Doc3;03/01/2012;Buffalo;779 Old Windy Road;Charles O'Brien;Bill Rudd


I hope that is not too confusing. I know this can probably be done with awk, but I am as rusty as the Titanic with coding and I'm lucky I got as far as I did with this project. Many thanks to "agama" for helping out on the last issue!

Thanks in advance for any replies!


Art

Last edited by Franklin52; 05-05-2012 at 10:45 AM.. Reason: Please use code tags for data and code samples, thank you
# 2  
Old 05-05-2012
First, convert the separator in File1 to a pipe; use tr or sed, it's fairly simple.

Then try this code :
Code:
awk -F'|' 'NR==FNR { a[$1 FS $2]; next }                 # file2: remember address|city pairs
           !(($4 FS $3) in a) { print > "nonmatch.txt" } # file1: keep non-matching records
          ' file2 file1
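As an aside, the conversion step can be skipped entirely, because awk lets you reassign FS between file operands on the command line. A sketch using the thread's sample data (nonmatch2.txt is just an illustrative name):

```shell
# Build small sample files matching the thread's data (illustrative only)
printf '%s\n' '6 Main Street|New York' '779 Old Windy Road|Buffalo' > file2
printf '%s\n' 'Doc1;03/01/2012;New York;6 Main Street;Mr. Smith 1;Mr. Jones' \
              'Doc2;03/01/2012;Syracuse;876 Broadway;John Davis;Barbara Lull' > file1

# FS is (re)assigned between file operands, so file2 is split on "|" and
# file1 on ";" in a single awk invocation.  The ($1,$2) subscript joins
# the fields with SUBSEP, avoiding ambiguity from the differing separators.
awk 'NR==FNR { seen[$1,$2]; next }   # file2: remember address,city
     !(($4,$3) in seen)              # file1: print records with no match
    ' FS='|' file2 FS=';' file1 > nonmatch2.txt
```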

Hope that helps

Regards
Peasant.
# 3  
Old 05-05-2012
Peasant, thanks so much for that! It worked perfectly!

What I did was, as you suggested, convert the ";" delimiters in the one file to "|" first, to get a common delimiter, since your code uses the -F"|" option:
Code:
sed 's/;/|/g' FileThatNeedsConverting > ConvertedFile
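Since only a single character changes, `tr` (which the earlier reply also mentioned) can do the same conversion:

```shell
# Sample input: one record from the thread
printf 'Doc2;03/01/2012;Syracuse;876 Broadway;John Davis;Barbara Lull\n' > FileThatNeedsConverting

# tr translates every ";" to "|", reading the file via redirection
tr ';' '|' < FileThatNeedsConverting > ConvertedFile
```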


I was going to ask how to see the discarded data, but a simple "diff" between the 2 files (the original and nonmatch.txt) accomplishes that.
Code:
diff OriginalFile nonmatch.txt
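Alternatively, the awk command can write out the discarded records at the same time, so no diff is needed afterwards. Same logic as Peasant's one-liner, with a hypothetical second output file match.txt (sample data shown with File1 already converted to "|"):

```shell
# Sample data as in the thread, File1 already "|"-delimited
printf '%s\n' '6 Main Street|New York' '779 Old Windy Road|Buffalo' > file2
printf '%s\n' 'Doc1|03/01/2012|New York|6 Main Street|Mr. Smith 1|Mr. Jones' \
              'Doc2|03/01/2012|Syracuse|876 Broadway|John Davis|Barbara Lull' > file1

# Route each File1 record to one of two files depending on the lookup,
# producing the kept and discarded sets in a single pass.
awk -F'|' 'NR==FNR { a[$1 FS $2]; next }
           { print > (($4 FS $3) in a ? "match.txt" : "nonmatch.txt") }
          ' file2 file1
```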

Also by doing a:
Code:
wc -l OriginalFile nonmatch.txt


you can see that records were shaved off. I just wanted to add this in case it helps someone else verify a similar project.
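That record-count check can itself be scripted; a small sketch with made-up counts (file names as in the thread):

```shell
# Toy stand-ins: 3 original records, 1 kept after filtering
printf '%s\n' a b c > OriginalFile
printf '%s\n' b > nonmatch.txt

# Discarded records = original count minus kept count
orig=$(wc -l < OriginalFile)
kept=$(wc -l < nonmatch.txt)
echo "$((orig - kept)) records removed"
```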


Many thanks man!


Art
