UNIX for Dummies Questions & Answers: script to remove duplicates per line
Post 302705971 by spacebar, Tuesday 25th of September 2012, 11:01:32 PM
See if the code below will work for you:
Code:
$ cat t
(56)(63)
(56)(70)(56)(70)(24)
(25)(78)
(12)(33)(12)
(10)
(10)

$ # Split each line's "(nn)" tokens onto their own lines, drop repeats with awk,
$ # and leave a blank line after each record so the records can be rejoined below.
$ while read l
> do
>   echo "$l" | sed -e 's/)(/)\n(/g' -e 's/)$/)\n/' | awk '!x[$0]++' >> t2
> done < t

$ # Paragraph mode (RS = "") reads each blank-line-separated record as one block;
$ # with OFS = "", the assignment $1 = $1 glues its fields back onto a single line.
$ awk 'BEGIN { RS = ""; OFS = "" } { $1 = $1; print }' t2
(56)(63)
(56)(70)(24)
(25)(78)
(12)(33)
(10)
(10)
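
If you would rather skip the temporary file, the same per-line de-duplication can be done in a single awk pass. The \n in the sed replacement above is a GNU sed extension, so it may not work everywhere; the awk-only sketch below avoids it and just assumes the tokens always look like (number) with no nesting:
Code:
awk '{
  out = ""
  split("", seen)                    # portable way to empty the per-line "seen" array
  s = $0
  while (match(s, /\([^)]*\)/)) {    # grab the next "(...)" token
    tok = substr(s, RSTART, RLENGTH)
    if (!(tok in seen)) {            # keep only its first occurrence on this line
      seen[tok] = 1
      out = out tok
    }
    s = substr(s, RSTART + RLENGTH)
  }
  print out
}' t

Run against the same t file it should print the same result as above; the two (10) lines stay because duplicates are only removed within a line, not across lines.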


10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Remove duplicates

Hello Experts, I have two files named old and new. Below are my example files. I need to compare them and print the records that exist only in my new file. I tried the awk script below; it works perfectly well when the records match exactly, but the issue I have is that my old file has got extra... (4 Replies)
Discussion started by: forumthreads
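
The question is cut off above, but for the common case where the first field identifies a record, a two-pass awk is the usual starting point; the file names and the choice of $1 as the key are assumptions here:
Code:
# remember the keys from the old file, then print new-file records whose key was never seen
awk 'NR == FNR { seen[$1]; next } !($1 in seen)' old new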

2. Shell Programming and Scripting

Shell script to remove duplicates lines in a file

Hi, I am writing a shell script that needs to remove duplicate lines within a file by category. Example: section a a c b a section b a b a c I need to remove the duplicates within the category without removing the duplicates from the 2 different sections (one of the a's in section... (1 Reply)
Discussion started by: RichElks
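
A sketch for this kind of per-section de-duplication, assuming every category starts with a header line beginning with "section": reset awk's seen array at each header so duplicates are only suppressed within a section.
Code:
awk '
  /^section/ { split("", seen); print; next }  # new section: forget what was seen before
  !seen[$0]++                                  # print a line only the first time it appears in this section
' file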

3. Shell Programming and Scripting

Script to remove duplicates

Hi, I need a script that removes the duplicate records and writes the result to a new file. For example, I have a file named test.txt that looks like abcd.23 abcd.24 abcd.25 qwer.25 qwer.26 qwer.98 I want to pick only $1, compare it with the next record, and the output should be abcd.23... (6 Replies)
Discussion started by: antointoronto
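
If the part before the dot is the key to compare on, awk's first-occurrence idiom with "." as the field separator is a likely fit (test.txt as in the post, output redirected to a new file):
Code:
# keep only the first record for each value of the part before the "."
awk -F. '!seen[$1]++' test.txt > new.txt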

4. Shell Programming and Scripting

delete from line and remove duplicates

My Input.....file1 ABCDE4435 Connected to 107.71.136.122 (SubNetwork=ONRM_RootMo_R SubNetwork=XYVLTN29CRBR99 MeContext=ABCDE4435 ManagedElement=1) ABCDE4478 Connected to 166.208.30.57 (SubNetwork=ONRM_RootMo_R SubNetwork=KLFMTN29CR0R04 MeContext=ABCDE4478 ManagedElement=1) ABCDE4478... (5 Replies)
Discussion started by: pareshkp
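
The wanted output is truncated above, but if the idea is to keep just the node name and its IP address while dropping repeated node names, a sketch based purely on the sample lines (the field positions are assumptions) would be:
Code:
# in the sample, $1 is the node name and $4 the IP address; print each node only once
awk '!seen[$1]++ { print $1, $4 }' file1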

5. Shell Programming and Scripting

Awk: Remove Duplicates

I have the following code for removing duplicate records based on fields in the input file: the first awk moves the duplicate records into a duplicates file, and in the second awk I fetch the non-duplicate entries from the input file into a tmp file and use move to update the original file. Requirement: Can both the awk... (4 Replies)
Discussion started by: siramitsharma
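
The two passes can normally be folded into one awk that writes to both files as it goes; this sketch assumes the first field is the duplicate key and uses placeholder file names:
Code:
awk '{
  if (seen[$1]++) print > "duplicates"   # second and later occurrences of a key
  else            print > "tmpfile"      # first occurrence of each key
}' inputfile
# afterwards: mv tmpfile inputfile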

6. Shell Programming and Scripting

awk remove first duplicates

Hi All, I have searched many threads for a close solution but was unable to find a similar scenario. I would like to print all duplicates based on the 3rd column except the first occurrence, and also print single (non-duplicate) entries. Input file: 12 NIL ABD LON 11 NIL ABC... (6 Replies)
Discussion started by: sybadm
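
Reading the file twice is the simplest way to know up front which 3rd-column values are duplicated; singles then print as-is and duplicates only from their second occurrence onwards (a sketch, with the same file named twice):
Code:
awk 'NR == FNR { count[$3]++; next }             # pass 1: count each 3rd-column key
     count[$3] == 1 || ++seen[$3] > 1' file file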

7. Shell Programming and Scripting

Help with merge and remove duplicates

Hi all, I need some help to remove duplicates from a file before merging. I have got 2 files. file1 has data in the format 4300 23456 4301 2357 where the 4-byte values on the right-hand side are unique and are not repeated anywhere in the file. file2 has data in the same format but is not in... (10 Replies)
Discussion started by: roy121
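
Taking the right-hand value as the key, awk's first-occurrence idiom merges and de-duplicates in one step; which file should win on a clash is an assumption here, controlled simply by the order the files are listed:
Code:
# file1 is read first, so on a clash its line is the one that survives
awk '!seen[$2]++' file1 file2 > merged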

8. Shell Programming and Scripting

Remove duplicates

I have a file with the following format: fields separated by "|" title1|something class|long...content1|keys title2|somhing class|log...content1|kes title1|sothing class|lon...content1|kes title3|shing cls|log...content1|ks I want to remove all duplicates with the same "title" field (the... (3 Replies)
Discussion started by: dtdt
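
Assuming the first copy of each title is the one to keep, the same first-occurrence idiom with "|" as the field separator and the title as field 1 should be close:
Code:
# keep only the first record for each title (field 1, "|"-separated)
awk -F'|' '!seen[$1]++' file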

9. Shell Programming and Scripting

Remove duplicates

Hi I have a below file structure. 200,1245,E1,1,E1,,7611068,KWH,30, ,,,,,,,, 200,1245,E1,1,E1,,7611070,KWH,30, ,,,,,,,, 300,20140223,0.001,0.001,0.001,0.001,0.001 300,20140224,0.001,0.001,0.001,0.001,0.001 300,20140225,0.001,0.001,0.001,0.001,0.001 300,20140226,0.001,0.001,0.001,0.001,0.001... (1 Reply)
Discussion started by: tejashavele

10. Shell Programming and Scripting

How to remove duplicates using for loop?

values=(1 2 3 5 4 2 3 1 6 8 3 5 ) # I need the output like this, with the duplicates removed: 1 2 3 5 4 6 8 # I don't need sorting in my program # please explain it simply using a for loop # OS: Ubuntu, shell: bash (5 Replies)
Discussion started by: Meeran Rizvi
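
Since the poster asks specifically for a for loop with no sorting, a plain nested-loop version in bash keeps the first occurrence of each value in order; an associative array would be shorter, but this sticks to the requested style:
Code:
values=(1 2 3 5 4 2 3 1 6 8 3 5)
result=()
for v in "${values[@]}"; do
  dup=0
  for r in "${result[@]}"; do            # has this value been kept already?
    [ "$v" = "$r" ] && { dup=1; break; }
  done
  [ "$dup" -eq 0 ] && result+=("$v")     # keep only the first occurrence, order preserved
done
echo "${result[@]}"                      # prints: 1 2 3 5 4 6 8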