delete from line and remove duplicates


 
# 1  
Old 02-20-2012

My input (file1):

Code:
ABCDE4435 Connected to 107.71.136.122 (SubNetwork=ONRM_RootMo_R SubNetwork=XYVLTN29CRBR99 MeContext=ABCDE4435 ManagedElement=1)	
ABCDE4478 Connected to 166.208.30.57 (SubNetwork=ONRM_RootMo_R SubNetwork=KLFMTN29CR0R04 MeContext=ABCDE4478 ManagedElement=1)	
ABCDE4478 Connected to 166.208.30.57 (SubNetwork=ONRM_RootMo_R SubNetwork=KLFMTN29CR0R04 MeContext=ABCDE4478 ManagedElement=1)	
ABCDE4503 Connected to 166.208.22.41 (SubNetwork=ONRM_RootMo_R SubNetwork=NSVLTR0R888 MeContext=ABCDE4503 ManagedElement=1)	
ABCDE4503 Connected to 166.208.22.41 (SubNetwork=ONRM_RootMo_R SubNetwork=NSVLTR0R888 MeContext=ABCDE4503 ManagedElement=1)


I need output like this (file2):


Code:
ABCDE4435	XYVLTN29CRBR99
ABCDE4478	KLFMTN29CR0R04
ABCDE4503	NSVLTR0R888


Last edited by methyl; 02-20-2012 at 08:29 PM. Reason: please use code tags
# 2  
Old 02-20-2012
Code:
awk '!seen[$1]++ {sub(/SubNetwork=/,"",$6); print $1"\t"$6}' file1 > file2

# 3  
Old 02-20-2012
Code:
awk -F'[= ]' '{a[$1" "$8]=$1" "$8} END{for(i in a) print a[i]}' file1
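
A side note on this approach: awk's "for (i in a)" loop visits array elements in no particular order, so the output sequence is not guaranteed. A minimal sketch of the same idea that keeps only the first occurrence of each node and preserves input order (the "seen" array name is just illustrative, and this is a sketch checked only against the sample data above):

Code:
awk -F'[= ]' '!seen[$1]++ {print $1 "\t" $8}' file1 > file2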

# 4  
Old 02-20-2012
One method:
Code:
awk '{print $1, $6}' file1 | sed -e "s/SubNetwork=//g" | sort | uniq > file2

ABCDE4435 XYVLTN29CRBR99
ABCDE4478 KLFMTN29CR0R04
ABCDE4503 NSVLTR0R888
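
As an aside, sort -u does the work of sort | uniq in one step; an equivalent pipeline under the same assumptions (space-separated output, as shown above) would be:

Code:
awk '{print $1, $6}' file1 | sed 's/SubNetwork=//g' | sort -u > file2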

# 5  
Old 02-20-2012
Code:
$
$ cat file1
ABCDE4435 Connected to 107.71.136.122 (SubNetwork=ONRM_RootMo_R SubNetwork=XYVLTN29CRBR99 MeContext=ABCDE4435 ManagedElement=1)
ABCDE4478 Connected to 166.208.30.57 (SubNetwork=ONRM_RootMo_R SubNetwork=KLFMTN29CR0R04 MeContext=ABCDE4478 ManagedElement=1)
ABCDE4478 Connected to 166.208.30.57 (SubNetwork=ONRM_RootMo_R SubNetwork=KLFMTN29CR0R04 MeContext=ABCDE4478 ManagedElement=1)
ABCDE4503 Connected to 166.208.22.41 (SubNetwork=ONRM_RootMo_R SubNetwork=NSVLTR0R888 MeContext=ABCDE4503 ManagedElement=1)
ABCDE4503 Connected to 166.208.22.41 (SubNetwork=ONRM_RootMo_R SubNetwork=NSVLTR0R888 MeContext=ABCDE4503 ManagedElement=1)
$
$
$ perl -lne 's/^(.*?)\s+.*SubNetwork=(.*?)\s+.*$/$1\t$2/; defined $x{$_}?"":print; $x{$_}++' file1
ABCDE4435       XYVLTN29CRBR99
ABCDE4478       KLFMTN29CR0R04
ABCDE4503       NSVLTR0R888
$
$

tyler_durden
# 6  
Old 02-21-2012
Thanks a lot guys... it's working fine.


Great help!