finding duplicates in columns and removing lines
# 16  
Old 11-29-2008
Can you advise on this one also, please? Thanks.

Dears,

I need to make field number 7 ($7) unique for the input below:

BSC38_E709 3025-Faiaz-43 43 2-0 SWRF9139V CTU2 X79T7H05ET B-U 2008-11-27
BSC38_E709 3025-_Faiaz-43 43 2-1 SWRF9139V CTU2 X79T7H05ET B-U 2008-11-14
BSC38_E709 3026-Rafgah-5 5 1-0 SWRF9139V CTU2 X79T7H06U3 B-U 2008-11-27
BSC38_E709 3026-Rafgah-5 5 1-1 SWRF9139V CTU2 X79T7H06U3 B-U 2008-11-14
BSC38_E709 3026-Rafgah-5 5 2-0 SWRF9139V CTU2 X79T7H06SM B-U 2008-11-27
BSC38_E709 3026-Rafgah-5 5 2-1 SWRF9139V CTU2 X79T7H06SM B-U 2008-11-14

and the output should be as below:

BSC38_E709 3025-Faiaz-43 43 2-0 SWRF9139V CTU2 X79T7H05ET B-U 2008-11-27
BSC38_E709 3026-Rafgah-5 5 1-0 SWRF9139V CTU2 X79T7H06U3 B-U 2008-11-27
BSC38_E709 3026-Rafgah-5 5 2-0 SWRF9139V CTU2 X79T7H06SM B-U 2008-11-27

Note: keep only the first line for each unique value in column 7, and print the entire line.

Your feedback is highly appreciated, thanks.
# 17  
Old 11-29-2008
Did you search the forum first?
Code:
awk '! _[$7]++' file

https://www.unix.com/shell-programmin...#post302189002
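For anyone finding this thread later, here is how the one-liner works. `_[$7]++` evaluates to 0 (false) the first time a given field-7 value is seen and to a positive number (true) afterwards; the leading `!` inverts that, so only the first occurrence of each value triggers awk's default print action. A minimal demonstration with made-up sample lines:

```shell
# Print only the first line seen for each distinct value of field 7.
printf '%s\n' \
  'a b c d e f KEY1 h 2008-11-27' \
  'a b c d e f KEY1 h 2008-11-14' \
  'a b c d e f KEY2 h 2008-11-27' |
awk '!_[$7]++'
# prints the KEY1 line dated 2008-11-27 and the KEY2 line;
# the second KEY1 line is suppressed
```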
# 18  
Old 11-29-2008
Yes, I did, but it didn't work.

I used the script below, but it takes too long:

touch D22
for id in `awk '/BSC/{print $13}' D3 | uniq`
do
grep "$id" D3 | head -1 >> D22
done

note: D22 is output file and D3 is the input file.

Is there any other suggestion? Thanks.
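The loop is slow because it re-reads D3 with grep once per id, which is quadratic in the file size. A single-pass sketch that does the same job (assuming, as in the script above, that the input is D3, the output is D22, and the key is field 13 on lines containing BSC — note it is not byte-for-byte identical to the grep version, since grep matched the id anywhere on the line):

```shell
# Read D3 once, keeping only the first line seen for each value of
# field 13 among lines that contain BSC.
awk '/BSC/ && !seen[$13]++' D3 > D22
```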

Last edited by ahmad_khouly; 11-29-2008 at 11:39 AM..