Remove duplicates based on query and subject fields from blast output file


 
# 1  
Old 05-15-2012

Hi all
I have a BLAST output file like this:
Code:
NZ_1540841_1561981 ICMP_1687819_1695946 92.59 27 2 0 12826 12852 3136 3162 0.28 38.2
NZ_1540841_1561981 ICMP_1687819_1695946 95.65 23 1 0 12268 12290 5815 5837 0.28 38.2
NZ_1540841_1561981 ICMP_3674888_3676546 82.70 185 32 0 9454 9638 11 195 6e-24  113
NZ_1540841_1561981 ICMP_3674888_3676546 83.33 90 15 0 1096 1185 1316 1405 8e-08 60.0
NZ_1565476_1586461 ICMP_1699632_1701095 81.16 329 62 0 5681 6009 389 61 3e-38  161
NZ_1565476_1586461 ICMP_1699632_1701095 85.90 156 22 0 4804 4959 1260 1105 2e-30  135
NZ_1565476_1586461 ICMP_1678249_1687718 88.83 913 102 0 7798 8710 8558 9470 0.0 1013
NZ_1565476_1586461 ICMP_1678249_1687718 90.42 522 50 0 8968 9489 11 532 0.0  638

and so on
I want the output to be:
Code:
NZ_1540841_1561981 ICMP_1687819_1695946 92.59 27 2 0 12826 12852 3136 3162 0.28 38.2
NZ_1540841_1561981 ICMP_3674888_3676546 82.70 185 32 0 9454 9638 11 195 6e-24  113
NZ_1565476_1586461 ICMP_1699632_1701095 81.16 329 62 0 5681 6009 389 61 3e-38  161
NZ_1565476_1586461 ICMP_1678249_1687718 88.83 913 102 0 7798 8710 8558 9470 0.0 1013

Could someone please help me write a script for this, ideally in Python? Sorry, I am just a beginner!
Thanks in advance !

# 2  
Old 05-15-2012
How about awk?

Code:
# print a line only the first time its (query, subject) key is seen
awk '!seen[$1,$2]++' filename
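
For readers who specifically want Python (as asked in post #1), here is a minimal sketch of the same idea: remember each (query, subject) pair in a set and print a line only the first time that pair appears. It assumes whitespace-separated columns with the query ID in column 1 and the subject ID in column 2; the file names are placeholders.

Code:
#!/usr/bin/env python3
# Keep only the first hit for every (query, subject) pair in tabular BLAST output.
# Assumes whitespace-separated columns: query ID in column 1, subject ID in column 2.
# The input and output file names below are placeholders.

seen = set()

with open("blast_output.txt") as infile, open("blast_dedup.txt", "w") as outfile:
    for line in infile:
        fields = line.split()
        if len(fields) < 2:
            continue                    # skip blank or malformed lines
        key = (fields[0], fields[1])    # query ID, subject ID
        if key not in seen:
            seen.add(key)
            outfile.write(line)

Save it under any name you like (e.g. dedup_blast.py) and run it with python3; it writes the de-duplicated lines to the output file in their original order.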

# 3  
Old 05-15-2012
Thanks pravin27, it worked!