I have an input file with this format:
6000000901 ;36200103 ;h3a01f496 ;
2000123605 ;36218982 ;heefa1328 ;
2000273132 ;36246985 ;h08c5cb71 ;
2000041207 ;36246985 ;heef75497 ;
Each field is separated by a semicolon. Sometimes, the second field is... (6 Replies)
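The snippet is cut off, but a common request with this layout is to drop records that repeat the second field. A minimal sketch, assuming you keep the first record seen for each second-field value (the file name `input.txt` is hypothetical):

```shell
# Sample data matching the format above (fields separated by ';')
printf '%s\n' \
  '6000000901 ;36200103 ;h3a01f496 ;' \
  '2000123605 ;36218982 ;heefa1328 ;' \
  '2000273132 ;36246985 ;h08c5cb71 ;' \
  '2000041207 ;36246985 ;heef75497 ;' > input.txt

# Keep only the first record seen for each value of field 2;
# seen[$2]++ is 0 (false) the first time a key appears, so the
# negation makes awk print only first occurrences
awk -F';' '!seen[$2]++' input.txt
```

With the sample above, the last line is dropped because field 2 (`36246985 `) was already seen.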
Hi:
I've been searching the net but didn't find a clue. I have a file in which, for some records, some fields coincide. I want to compare one (or more) of the dissimilar fields and retain the one record that fulfills a certain condition. For example, in this file:
99 TR 1991 5 06 ... (1 Reply)
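The condition is cut off above, so as a hedged sketch, assume records sharing fields 1 and 2 coincide and we keep the one with the largest value in field 5 (the sample rows and file name `sample.txt` are illustrative only):

```shell
printf '%s\n' \
  '99 TR 1991 5 06' \
  '99 TR 1991 5 09' \
  '12 AB 1985 3 02' > sample.txt

# For records sharing fields 1 and 2, remember the line with the
# numerically largest field 5 and print one winner per key at the end
awk '{
    key = $1 FS $2
    if (!(key in best) || $5 + 0 > best[key] + 0) { best[key] = $5; line[key] = $0 }
} END { for (k in line) print line[k] }' sample.txt
```

Swap the comparison (`<` instead of `>`, or a different field) to match whatever condition the real data requires.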
I have a file from which I need to remove duplicates. The problem is, I need to keep only the records whose 3rd field is unique. Here is a sample file:
xxx.xxx:x:CISCO1.CLEVE61W:ERIE.NET:x:x:x:x:
xxx.xxx:x:CISCO2.CLEVE62W:OHIO.NET:x:x:x:x:
xxx.xxx:x:CISCO2.CLEVE62W:NORTH.NET:x:x:x:x:... (1 Reply)
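Since every record with a repeated 3rd field should go (not just the extra copies), a two-pass awk fits: count third-field values on the first pass, print only the singletons on the second. A sketch using the sample rows (the file name `hosts.txt` is illustrative):

```shell
printf '%s\n' \
  'xxx.xxx:x:CISCO1.CLEVE61W:ERIE.NET:x:x:x:x:' \
  'xxx.xxx:x:CISCO2.CLEVE62W:OHIO.NET:x:x:x:x:' \
  'xxx.xxx:x:CISCO2.CLEVE62W:NORTH.NET:x:x:x:x:' > hosts.txt

# Pass 1 (NR==FNR): count each field-3 value.
# Pass 2: print only lines whose field-3 count is exactly 1.
awk -F':' 'NR==FNR { cnt[$3]++; next } cnt[$3]==1' hosts.txt hosts.txt
```

With the sample, only the `CISCO1.CLEVE61W` line survives, since `CISCO2.CLEVE62W` appears twice.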
Hello,
Although I have found similar questions, I could not find advice that
could help with our problem.
The issue:
We have several hundred text files containing repeated blocks of text
(I guess back at the time they were prepared like that to optimize
printing).
The block of texts... (13 Replies)
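The description is cut off, but if the repeated blocks are paragraphs separated by blank lines, awk's paragraph mode (`RS=""`) can deduplicate whole blocks at once. A sketch under that assumption (`report.txt` is a hypothetical file):

```shell
# Build a file where the first paragraph repeats
printf 'block one\nline two\n\nblock two\n\nblock one\nline two\n' > report.txt

# RS="" makes each blank-line-separated block one record;
# ORS="\n\n" restores a blank line between the blocks we keep.
# !seen[$0]++ prints each distinct block only once.
awk 'BEGIN { RS=""; ORS="\n\n" } !seen[$0]++' report.txt
```

If the blocks are not blank-line delimited, the same `!seen[...]++` idea still applies once `RS` is set to whatever actually separates them.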
Hi,
How can I remove duplicates from a file based on grouping by another column? For example:
Test1|Test2|Test3|Test4|Test5
Test1|Test6|Test7|Test8|Test5
Test1|Test9|Test10|Test11|Test12
Test1|Test13|Test14|Test15|Test16
Test17|Test18|Test19|Test20|Test21
Test17|Test22|Test23|Test24|Test5
... (2 Replies)
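The intent is cut off, but in the sample, field 5 repeats within the `Test1` group (`Test5` on the first two rows). A hedged guess: treat field 1 as the group and drop later rows that repeat field 5 inside the same group, which a compound array key handles in one pass (`data.txt` is illustrative):

```shell
printf '%s\n' \
  'Test1|Test2|Test3|Test4|Test5' \
  'Test1|Test6|Test7|Test8|Test5' \
  'Test1|Test9|Test10|Test11|Test12' \
  'Test1|Test13|Test14|Test15|Test16' \
  'Test17|Test18|Test19|Test20|Test21' \
  'Test17|Test22|Test23|Test24|Test5' > data.txt

# ($1,$5) keys the seen-array on group AND value, so Test5 may
# still appear once per group (Test1 and Test17 each keep one)
awk -F'|' '!seen[$1,$5]++' data.txt
```

With the sample, only the second row is dropped; the last row keeps its `Test5` because it belongs to a different group.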
Hi ,
Sometimes I get duplicated values in my files, e.g.:
bundle_identifier= B
Sometext=ABC
bundle_identifier= A
bundle_unit=500
Sometext123=ABCD
bundle_unit=400
I need to check whether there are duplicated values or not; if yes, I need to check whether the value is A or B when bundle_identifier ,... (2 Replies)
Dear community,
I have to remove duplicate lines from a file containing a very large number of rows (millions?) based on the 1st and 3rd columns.
The data are like this:
Region 23/11/2014 09:11:36 41752
Medio 23/11/2014 03:11:38 4132
Info 23/11/2014 05:11:09 4323... (2 Replies)
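For "first occurrence wins" on a compound key, the standard one-pass awk idiom works and only holds one hash entry per distinct key, which usually scales to millions of rows. A sketch with the sample layout plus one invented duplicate line (`big.txt` is illustrative):

```shell
printf '%s\n' \
  'Region 23/11/2014 09:11:36 41752' \
  'Medio 23/11/2014 03:11:38 4132' \
  'Region 23/11/2014 09:11:36 99999' \
  'Info 23/11/2014 05:11:09 4323' > big.txt

# Key on fields 1 and 3 together; print a line only the first
# time its ($1,$3) combination is seen
awk '!seen[$1,$3]++' big.txt
```

With the sample, the third line is dropped because `Region` + `09:11:36` already appeared.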
Dear folks
I have a map file of around 54K lines, and some of the values in the second column are repeated. I want to find them and delete all of the repeated values. I looked at duplicate-removal commands, but my case is not to keep one of the duplicate values. I want to remove all of the same... (4 Replies)
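Since no copy of a duplicated value should survive, a one-pass `!seen[...]++` is the wrong tool; a two-pass awk that first counts column-2 values and then prints only the singletons does it. A sketch with invented map-file rows (the contents and name `map.txt` are hypothetical):

```shell
printf '%s\n' \
  'snp1 100' \
  'snp2 200' \
  'snp3 200' \
  'snp4 300' > map.txt

# Pass 1 (NR==FNR): count every field-2 value.
# Pass 2: keep only lines whose field-2 value occurred exactly once,
# so both '200' lines disappear entirely.
awk 'NR==FNR { cnt[$2]++; next } cnt[$2]==1' map.txt map.txt
```

The file is read twice, which is cheap at 54K lines.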
Hi,
My input looks like this (tab-delimited):
grp1 name2 firstname M 55 item1 item1.0
grp1 name2 firstname F 55 item1 item1.0
grp2 name1 firstname M 55 item1 item1.0
grp2 name2 firstname M 55 item1 item1.0
Using awk, I am trying to discard the records with common fields 2, 4, 5, 6, 7... (4 Replies)
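The sentence is cut off, but assuming every record sharing the combination of fields 2, 4, 5, 6, 7 should be discarded (not just the later copies), the same two-pass pattern applies with a compound key (`in.txt` is illustrative):

```shell
# Tab-delimited sample matching the input above
printf 'grp1\tname2\tfirstname\tM\t55\titem1\titem1.0\n' >  in.txt
printf 'grp1\tname2\tfirstname\tF\t55\titem1\titem1.0\n' >> in.txt
printf 'grp2\tname1\tfirstname\tM\t55\titem1\titem1.0\n' >> in.txt
printf 'grp2\tname2\tfirstname\tM\t55\titem1\titem1.0\n' >> in.txt

# Pass 1: count each (f2,f4,f5,f6,f7) combination.
# Pass 2: print only records whose combination is unique.
awk -F'\t' 'NR==FNR { cnt[$2,$4,$5,$6,$7]++; next } cnt[$2,$4,$5,$6,$7]==1' in.txt in.txt
```

With the sample, rows 1 and 4 share `name2 M 55 item1 item1.0` and are both discarded, leaving rows 2 and 3.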