UNIX for Dummies Questions & Answers: script to remove duplicates per line
Post 302705993 by pamu, Wednesday, 26 September 2012, 01:49 AM
Try this:

Code:
# split each line on parentheses; print each "(...)" group only the first time it appears on that line
awk -F '[()]' '{for (i=2; i<=NF; i+=2) if (!X[$i]++) printf "(%s)", $i; delete X; print ""}' file

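For readers who want to follow the one-liner: -F '[()]' makes every parenthesised group an even-numbered field, the array X remembers which groups have already been printed on the current line, and the array is cleared before the next line is read. A minimal demo with made-up sample data (the file contents below are only an illustration, not from the original thread):

Code:
$ cat file
(a)(b)(a)(c)
(x)(x)(y)
$ awk -F '[()]' '{for (i=2; i<=NF; i+=2) if (!X[$i]++) printf "(%s)", $i; delete X; print ""}' file
(a)(b)(c)
(x)(y)

Note that anything before the first "(" on a line (field 1) is discarded by this approach, and clearing a whole array with delete X is an extension that is widely supported but not strictly POSIX.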

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Remove duplicates

Hello Experts, I have two files named old and new. Below are my example files. I need to compare the two and print the records that only exist in my new file. I tried the awk script below; it works perfectly well if the records are an exact match. The issue I have is that my old file has got extra... (4 Replies)
Discussion started by: forumthreads
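For a request like this one, a minimal sketch that prints every line of new that does not appear in old, assuming whole-line comparison (the file names old and new are taken from the excerpt):

Code:
# load all of "old" into an array, then print lines of "new" that were never seen
awk 'NR==FNR {seen[$0]; next} !($0 in seen)' old new

If only certain columns matter for the comparison, key the array on those fields (for example seen[$1]) instead of the whole line.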

2. Shell Programming and Scripting

Shell script to remove duplicates lines in a file

Hi, I am writing a shell script that needs to remove duplicate lines within a file by category. example: section a a c b a section b a b a c I need to remove the duplicates within the category without removing the duplicates from the 2 different sections (one of the a's in section... (1 Reply)
Discussion started by: RichElks
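A minimal sketch for per-section de-duplication, assuming every category begins with a header line starting with the word "section" (that trigger is an assumption based on the excerpt):

Code:
# reset the "seen" array at each section header, then drop repeats within the section
awk '/^section/ {delete seen; print; next} !seen[$0]++' file

On awks without whole-array delete, split("", seen) can be used to clear the array instead.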

3. Shell Programming and Scripting

Script to remove duplicates

Hi, I need a script that removes the duplicate records and writes the result to a new file. For example, I have a file named test.txt that looks like abcd.23 abcd.24 abcd.25 qwer.25 qwer.26 qwer.98 I want to pick only $1, compare it with the next record, and the output should be abcd.23... (6 Replies)
Discussion started by: antointoronto
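If the goal is to keep only the first record for each prefix before the dot (my reading of the excerpt, so treat it as an assumption), a sketch:

Code:
# split on a literal ".", keep the first line seen for each prefix
awk -F'[.]' '!seen[$1]++' test.txt > newfile.txt

The output file name newfile.txt is just a placeholder.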

4. Shell Programming and Scripting

delete from line and remove duplicates

My Input.....file1 ABCDE4435 Connected to 107.71.136.122 (SubNetwork=ONRM_RootMo_R SubNetwork=XYVLTN29CRBR99 MeContext=ABCDE4435 ManagedElement=1) ABCDE4478 Connected to 166.208.30.57 (SubNetwork=ONRM_RootMo_R SubNetwork=KLFMTN29CR0R04 MeContext=ABCDE4478 ManagedElement=1) ABCDE4478... (5 Replies)
Discussion started by: pareshkp

5. Shell Programming and Scripting

Awk: Remove Duplicates

I have the following code for removing duplicate records based on fields in the input file: it moves the duplicate records to a duplicates file (1st awk), and in the 2nd awk I fetch the non-duplicate entries from the input file to a tmp file and use a move to update the original file. Requirement: Can both the awk... (4 Replies)
Discussion started by: siramitsharma
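Without seeing the two original scripts this can only be a sketch, but a single pass that routes repeats to the duplicates file and first occurrences to a tmp file might look like this (the key field $1 and the file names are assumptions):

Code:
# first occurrence of each key stays, later occurrences go to duplicates.txt
awk '{ if (seen[$1]++) print > "duplicates.txt"; else print > "inputfile.tmp" }' inputfile &&
mv inputfile.tmp inputfile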

6. Shell Programming and Scripting

awk remove first duplicates

Hi All, I have searched many threads for a possible close solution, but I was unable to find a similar scenario. I would like to print all duplicates based on the 3rd column except the first occurrence, and also print a line if it is a single entry (non-duplicate). Input file: 12 NIL ABD LON 11 NIL ABC... (6 Replies)
Discussion started by: sybadm
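A two-pass sketch matching that reading (print a line if its 3rd-column value occurs only once, or if it is a repeat other than the first occurrence); the file is deliberately named twice so the counts are known before printing:

Code:
# pass 1 counts column-3 values, pass 2 prints singletons plus every repeat after the first
awk 'NR==FNR {count[$3]++; next} count[$3]==1 || seen[$3]++' file file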

7. Shell Programming and Scripting

Help with merge and remove duplicates

Hi all, I need some help to remove duplicates from a file before merging. I have got 2 files: file1 has data in the format 4300 23456 4301 2357 The 4-byte values on the right-hand side are unique and are not repeated anywhere in the file. File 2 has data in the same format but is not in... (10 Replies)
Discussion started by: roy121
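Assuming the merge should keep only the first record seen for each value in the second column (the column choice and output name are assumptions), a sketch:

Code:
# concatenate both files, keep one copy per value in column 2
awk '!seen[$2]++' file1 file2 > merged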

8. Shell Programming and Scripting

Remove duplicates

I have a file with the following format, fields separated by "|": title1|something class|long...content1|keys title2|somhing class|log...content1|kes title1|sothing class|lon...content1|kes title3|shing cls|log...content1|ks I want to remove all duplicates with the same "title" field (the... (3 Replies)
Discussion started by: dtdt
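Since the title is the first "|"-separated field, a minimal sketch that keeps the first record for each title:

Code:
# "|" as field separator, keep only the first line per title
awk -F'|' '!seen[$1]++' file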

9. Shell Programming and Scripting

Remove duplicates

Hi, I have the below file structure. 200,1245,E1,1,E1,,7611068,KWH,30, ,,,,,,,, 200,1245,E1,1,E1,,7611070,KWH,30, ,,,,,,,, 300,20140223,0.001,0.001,0.001,0.001,0.001 300,20140224,0.001,0.001,0.001,0.001,0.001 300,20140225,0.001,0.001,0.001,0.001,0.001 300,20140226,0.001,0.001,0.001,0.001,0.001... (1 Reply)
Discussion started by: tejashavele

10. Shell Programming and Scripting

How to remove duplicates using for loop?

values=(1 2 3 5 4 2 3 1 6 8 3 5) # I need the output like this, with the duplicates removed: 1 2 3 5 4 6 8 # I don't need sorting in my program # please explain it simply using a for loop # OS: Ubuntu, shell: bash (5 Replies)
Discussion started by: Meeran Rizvi
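A bash sketch that walks the array with for loops and keeps only the first occurrence of each value, preserving the original order (no sorting involved):

Code:
#!/bin/bash
values=(1 2 3 5 4 2 3 1 6 8 3 5)
result=()
for v in "${values[@]}"; do
    dup=0
    for r in "${result[@]}"; do            # has this value been kept already?
        [ "$v" = "$r" ] && { dup=1; break; }
    done
    [ "$dup" -eq 0 ] && result+=("$v")     # keep first occurrence only
done
echo "${result[@]}"                        # prints: 1 2 3 5 4 6 8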