UNIX scripting for finding duplicates and null records in pk columns


 
# 1  
Old 05-10-2014
UNIX scripting for finding duplicates and null records in pk columns

Hi,
I have a requirement. For example: I have a text file that uses the pipe symbol (|) as delimiter, with 4 columns a, b, c, d. Here a and b are the primary key columns.
I want to process that file to find duplicates and null values in the primary key columns (a, b). I want to write the unique records whose pk columns are not null into one file, and the duplicate records plus the records having null pk columns into another file.

Sample input: abc.txt
Code:
a|b|c|d
11|55|ram|mgr
22||raj|celrk
33|10|sam|am
11|55|ram|mgr

Output file 1: unique records
Code:
11|55|ram|mgr
33|10|sam|am

Output file 2: duplicate records and records with null pk columns
Code:
22||raj|celrk
11|55|ram|mgr

Please help me achieve this using a UNIX script.
Thanks

Last edited by Don Cragun; 05-10-2014 at 06:33 PM.. Reason: Add CODE tags.
# 2  
Old 05-10-2014
Is this a homework assignment?
# 3  
Old 05-11-2014
No... it's a requirement in my project.
# 4  
Old 05-11-2014
Here is an awk approach:
Code:
awk -F\| '
        NR == 1 {                               # skip the header line (a|b|c|d)
                next
        }
        {
                I = $1 OFS $2                   # composite key from the two pk columns
                if ( ( I in U ) || !($1 && $2) )
                        print $0 > "dupl.txt"   # duplicate key, or a missing pk column
        }
        $1 && $2 {
                U[I] = $0                       # remember the record for this non-empty key
        }
        END {
                for ( k in U )
                        print U[k] > "uniq.txt" # emit one record per unique key
        }
' abc.txt

This program writes the duplicate records (and the records with null pk columns) to dupl.txt, and the unique records to uniq.txt.
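
For reference, running it against the sample abc.txt from post #1 should give files along these lines (the order of lines in uniq.txt is not guaranteed, because for ( k in U ) walks the array in an unspecified order):
Code:
$ cat dupl.txt
22||raj|celrk
11|55|ram|mgr
$ cat uniq.txt
11|55|ram|mgr
33|10|sam|am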
# 5  
Old 05-11-2014
Hi praveenraj.1991,
Yoda's script may work fine for you, but your requirements are a little vague.

Can 0 be a key? If so, can 00 be a key? If so, are 0 and 00 distinct keys? (If the answer to any of these is yes, Yoda's script won't work for you.)

If you have lines:
Code:
11|55|ram|mgr
55|11|abc|def
11|10|efg|hij
33|10|sam|am

what should be the output? Or, more explicitly, does the order matter: are 11|55 and 55|11 duplicate keys? And does each pair of keys have to be unique, or does each individual key have to be unique: are 55|11 and 11|30 duplicates because 11 is a common key? (If the answer to any of these is yes, Yoda's script won't work for you.)
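
If 0 (or 00) does need to count as a valid key, a minimal variant of Yoda's script, assuming the pk columns should simply be compared against the empty string and that 0 and 00 are then distinct keys, would be something like:
Code:
awk -F\| '
        NR == 1 { next }                        # skip the header line
        {
                I = $1 SUBSEP $2                # composite key of the two pk columns
                if ( (I in U) || $1 == "" || $2 == "" )
                        print > "dupl.txt"      # duplicate key, or an empty pk column
                else
                        U[I] = $0               # keep the first record seen for this key
        }
        END {
                for ( k in U )
                        print U[k] > "uniq.txt"
        }
' abc.txt

Note that this sketch still treats 11|55 and 55|11 as distinct keys, and it only requires the key pair, not each individual column, to be unique.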
# 6  
Old 05-11-2014
Simplifying a bit:
Code:
awk -F\| 'NR>1{print>(!A[$1,$2]++ && $1!="" && $2!=""?u:d)}' u=uniq.out d=dupl.out abc.txt
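
For readability, that one-liner can be expanded into roughly the following equivalent form (a sketch; the post-increment on A[$1,$2] is what marks a key pair as already seen, and the explicit != "" comparisons mean that 0 is accepted as a key here):
Code:
awk -F\| '
        NR > 1 {
                # first occurrence of this (a,b) key pair, and neither pk column is empty
                if ( !A[$1,$2]++ && $1 != "" && $2 != "" )
                        print > u               # -> uniq.out
                else
                        print > d               # -> dupl.out
        }
' u=uniq.out d=dupl.out abc.txt

Unlike the script in post #4, this writes the retained records in input order as it reads them, rather than collecting them in an array and printing them in the END block.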


Last edited by Scrutinizer; 05-11-2014 at 05:27 AM..