Removing duplicates on a single "column" (delimited file)
# 1  

Hello!

I'm quite new to Linux and haven't found a script for this task; unfortunately my knowledge of shell scripting is quite limited...

Could you guys help me remove the duplicate lines of a file, based only on a single "column"?

For example:

Code:
M202034357;01/2008;J30RJ021;Ciclo 01 de Faturamento;4000029579;01F010800017;270500591331;175130959;000074873-AB;9.9;RIO DE JANEIRO
M202034357;01/2008;J30AP096;Ciclo 01 de Faturamento;4000029579;01F010800017;270500589332;175123672;000001842-AB;9.9;MACAPA
M202034357;01/2008;J30RJ021;Ciclo 01 de Faturamento;4000043657;01F010800002;118000613348;175138146;000161122-AA;9.9;RIO DE JANEIRO
M202034357;01/2008;J30DF061;Ciclo 06 de Faturamento;4000034956;06F010800020;269800607228;173691920;000030011-AA;9.9;GUARA
M202034357;01/2008;J30RJ021;Ciclo 01 de Faturamento;4000029579;01F010800017;270500588743;175121705;000188224-AA;9.9;NITEROI
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500589299;175123639;000241055-AB;9.9;SAO PAULO
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500589787;175125437;000256241-AB;9.9;SAO PAULO
M202034357;01/2008;J30AM097;Ciclo 01 de Faturamento;4000043657;01F010800002;118000614870;175142866;000026153-AA;4.99;MANAUS
M202034357;01/2008;J30PA091;Ciclo 01 de Faturamento;4000043657;01F010800002;118000614087;175140485;000023707-AA;9.9;BELEM
M202034357;01/2008;J30PA091;Ciclo 01 de Faturamento;4000043785;01F010800027;270200624370;175114167;000011219-AB;9.9;BELÉM
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500591956;175132948;000441734-AA;9.9;SAO BERNARDO DO CAMPO
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500590036;175126399;000458131-AA;9.9;SAO CAETANO DO SUL
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500591958;175132950;000441735-AA;9.9;SAO PAULO
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000043657;01F010800002;118000612017;175130959;000469327-AA;9.9;GUARULHOS

So the eighth field (175130959 in the example) appears duplicated on a few lines, like the first and last ones, even though the rest of the data on those lines often differs.

For my purposes it doesn't matter if I lose the later occurrences, even when the info before and after is different... So what I need is a script (maybe awk or cut) that recognizes the same string at position 8 and, if it was already found before, deletes that whole line, but keeps every line whose string at position 8 has not appeared before.

Ideas?

Last edited by jim mcnamara; 01-28-2016 at 05:15 PM.. Reason: code tags
# 2  
Try:
Code:
awk -F ';'  '!arr[$8]++' oldfile > newfile

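To see why this one-liner works: arr[$8]++ evaluates to the old count (0 the first time a field-8 value is seen), so !arr[$8]++ is true only on the first occurrence of each key. A quick demonstration on a made-up two-line sample (KEY1 is a hypothetical field-8 value, not from your data):

```shell
# Two lines sharing the same value in field 8; only the first survives.
printf '%s\n' 'a;b;c;d;e;f;g;KEY1;x' 'a;b;c;d;e;f;g;KEY1;y' \
  | awk -F ';' '!arr[$8]++'
# prints: a;b;c;d;e;f;g;KEY1;x
```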
# 3  
Thank you very much, it worked... I shall be studying this construct from now on; it is proving to be very useful.
# 4  
The following variant saves some memory (an integer per unique key, since it stores empty array entries instead of counters):
Code:
awk -F ';'  '!($8 in A) {A[$8]; print}' oldfile > newfile

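The output is the same as with the counter version; only the bookkeeping differs. A quick check on a small made-up sample (K1/K2 are hypothetical field-8 values):

```shell
# The in-operator tests key existence without creating a counter;
# the bare A[$8] then records the key with an empty value.
printf '%s\n' 'x;x;x;x;x;x;x;K1;a' 'x;x;x;x;x;x;x;K2;b' 'x;x;x;x;x;x;x;K1;c' \
  | awk -F ';' '!($8 in A) {A[$8]; print}'
# prints the K1;a and K2;b lines; the second K1 line is dropped
```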
# 5  
I prefer the awk solutions suggested by Jim McNamara and MadeInGermany for your stated problem, but you could also consider this alternative for cases where you want the output sorted on the field you're using to select records:
Code:
sort -t';' -u -k8,8 oldfile > newfile

which, with your sample input in oldfile, produces the output:
Code:
M202034357;01/2008;J30DF061;Ciclo 06 de Faturamento;4000034956;06F010800020;269800607228;173691920;000030011-AA;9.9;GUARA
M202034357;01/2008;J30PA091;Ciclo 01 de Faturamento;4000043785;01F010800027;270200624370;175114167;000011219-AB;9.9;BELÉM
M202034357;01/2008;J30RJ021;Ciclo 01 de Faturamento;4000029579;01F010800017;270500588743;175121705;000188224-AA;9.9;NITEROI
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500589299;175123639;000241055-AB;9.9;SAO PAULO
M202034357;01/2008;J30AP096;Ciclo 01 de Faturamento;4000029579;01F010800017;270500589332;175123672;000001842-AB;9.9;MACAPA
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500589787;175125437;000256241-AB;9.9;SAO PAULO
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500590036;175126399;000458131-AA;9.9;SAO CAETANO DO SUL
M202034357;01/2008;J30RJ021;Ciclo 01 de Faturamento;4000029579;01F010800017;270500591331;175130959;000074873-AB;9.9;RIO DE JANEIRO
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500591956;175132948;000441734-AA;9.9;SAO BERNARDO DO CAMPO
M202034357;01/2008;J30SP011;Ciclo 01 de Faturamento;4000029579;01F010800017;270500591958;175132950;000441735-AA;9.9;SAO PAULO
M202034357;01/2008;J30RJ021;Ciclo 01 de Faturamento;4000043657;01F010800002;118000613348;175138146;000161122-AA;9.9;RIO DE JANEIRO
M202034357;01/2008;J30PA091;Ciclo 01 de Faturamento;4000043657;01F010800002;118000614087;175140485;000023707-AA;9.9;BELEM
M202034357;01/2008;J30AM097;Ciclo 01 de Faturamento;4000043657;01F010800002;118000614870;175142866;000026153-AA;4.99;MANAUS

in newfile.
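One caveat worth noting: with sort -u, which of several lines sharing a key is kept is not guaranteed to be the first one from the input. If you want sorted output but still want to keep the first occurrence from the original file, you can dedupe with awk first and sort afterwards; a sketch on a small made-up sample:

```shell
# awk keeps the first line per field-8 key, then sort orders the
# survivors on that field. The 'B;first' line wins over 'B;second'.
printf '%s\n' '1;2;3;4;5;6;7;B;first' '1;2;3;4;5;6;7;A;x' '1;2;3;4;5;6;7;B;second' \
  | awk -F ';' '!seen[$8]++' | sort -t ';' -k8,8
# prints the A line, then the B;first line
```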