Shell Programming and Scripting

How to check duplicate entries in a file? (Solaris-9)

    #1  06-13-2017, solaris_1977

Hi,
There are duplicate entries in this file, but uniq will not see them because the first field is different. How can I catch all lines that have duplicate IPs?

Code:
bash-2.05# cat db.file | grep 172.30.133.11
dsrq-ctrl1-prod         A       172.30.133.11
e911q-db1-nxge0         A       172.30.133.11
bash-2.05#
bash-2.05# cat db.file | grep 172.30.133.12
dsrq-ctrl2-prod         A       172.30.133.12
e911q-icp1-nxge0        A       172.30.133.12

Thanks
    #2  06-13-2017, rdrtx1

Code:
awk 'NR==FNR {if ($NF ~ /[0-9][.][0-9]/) a[$NF]++; next}; {if (a[$NF] > 1) print $0}' db.file db.file

Also try either /usr/xpg4/bin/awk or nawk on Solaris.
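
For illustration, here is a sketch assuming the four sample lines from post #1 are present in db.file: the first pass (NR==FNR) counts each IP-looking last field, and the second pass prints the lines whose IP was counted more than once. The output would then look something like this:

Code:
# pass 1 (NR==FNR): count each last field that looks like an IP; pass 2: print lines whose IP occurs more than once
bash-2.05# nawk 'NR==FNR {if ($NF ~ /[0-9][.][0-9]/) a[$NF]++; next}; {if (a[$NF] > 1) print $0}' db.file db.file
dsrq-ctrl1-prod         A       172.30.133.11
e911q-db1-nxge0         A       172.30.133.11
dsrq-ctrl2-prod         A       172.30.133.12
e911q-icp1-nxge0        A       172.30.133.12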

    #3  06-13-2017, solaris_1977

This command ran and came back to the prompt without any output:

Code:
bash-2.05$ cat db.file | wc -l
   11354
bash-2.05$ awk 'NR==FNR {if ($NF ~ /[0-9][.][0-9]/) a[$NF]++; next} a[$NF] > 1' db.file db.file
bash-2.05$ cat db.file | wc -l
   11354
bash-2.05$

    #4  06-14-2017, MadeInGermany (Moderator)

It works.
Make sure db.file is given twice as command arguments (awk needs to read the file twice, once per pass),
and use /usr/xpg4/bin/awk or nawk!
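
A likely reason for the empty output (an assumption, not confirmed in the thread): the default /usr/bin/awk on Solaris is the old awk and does not know the FNR variable, so the NR==FNR test never matches and the count array is never filled. With one of the newer awks the same two-pass command should work, for example:

Code:
# same one-liner as post #2, run with the POSIX awk so NR==FNR behaves as intended
bash-2.05$ /usr/xpg4/bin/awk 'NR==FNR {if ($NF ~ /[0-9][.][0-9]/) a[$NF]++; next}; {if (a[$NF] > 1) print $0}' db.file db.file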
    #5  06-14-2017, solaris_1977

Thanks. I didn't know that it is a different awk.