Shell Programming and Scripting: find duplicate records... again
Post 302280442 by Ygor, Tuesday 27th of January 2009, 12:39:36 AM
Try reading the file twice with gawk. The first pass counts a key built from fields 5, 6, 7, 9 and 10 of each record (the date and position columns in your sample); the second pass writes each record to file.uniq or file.dupe depending on whether that key occurred only once:
Code:
gawk 'BEGIN{RS="\\*\\*\n+";ORS="**\n"}          # records are "**"-terminated blocks
      NR==FNR{a[$5,$6,$7,$9,$10]++;next}        # pass 1: count each key
      {print $0 > (FILENAME "." (a[$5,$6,$7,$9,$10]==1?"uniq":"dupe"))}' file file   # pass 2: split records
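
Note that a multi-character, regular-expression RS is a gawk extension (POSIX awk only guarantees a single-character RS), so this needs gawk rather than plain awk or nawk. If you want to confirm that your awk splits the records the way you expect, a minimal check, assuming the same file layout, is just to print the record count:
Code:
gawk 'BEGIN{RS="\\*\\*\n+"} END{print NR " records"}' file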

Tested...
Code:
$ head -1000 file.*
==> file.dupe <==
XX ME   342 8688 2006  7  6 3c  60.029  -38.568  2901 0001   74   4 7603  8
    969.8    958.4   3.6320  34.8630
    985.5    973.9   3.6130  34.8600
    998.7    986.9   3.6070  34.8610
   1003.6    991.7   3.6240  34.8660
**
XX ME   342 8688 2006  7  6 3c  60.029  -38.568  2901 0001   74   4 7603  8
      1.6      1.6   8.9330  34.9230
     13.5     13.4   8.4880  34.9200
**

==> file.uniq <==
XX ME   342 8689 2006  7  6 3c  60.065  -38.617  2890 0001   74   4 7603  8
    960.9    949.6   3.6020  34.8580
    976.5    965.0   3.5870  34.8580
    991.6    979.9   3.5800  34.8580
   1002.8    990.9   3.5760  34.8580
   1003.9    992.0   3.5760  34.8590
**
XX ME   342 9690 2006  7  7 3c  60.100  -38.669  2876 0001   74   4 7603  8
    975.3    963.8   3.5820  34.8580
    992.3    980.6   3.5660  34.8570
   1003.3    991.4   3.5640  34.8580
   1004.4    992.5   3.5630  34.8590
**
$
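
If you want a quick sanity check that every record landed in exactly one of the two output files, counting the "**" record terminators is enough. A minimal sketch, assuming each record really ends with a line containing only "**":
Code:
grep -cx '\*\*' file.uniq file.dupe   # records written to each output file
grep -cx '\*\*' file                  # should equal the sum of the two counts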

 
