Full Discussion: awk remove first duplicates
Post 302885764 by RavinderSingh13 on Tuesday 28th of January 2014, 09:48:25 AM
Hello,

The following may help:

Code:
awk 'NR==1 {print} f ~ $3 && i == 0 {i++;} f ~ $3 && i > 0 {print $0;i=0;j=1} f !~ $3 && j==1  {print $0} {f=$3;}'  file_name


The output will be as follows:

Code:
12  NIL ABD LON
12  NIL ABC AMR
13  NIL ABC AMR
11  NIL ABK AMR


NOTE: This will only work for this particular input.
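
The one-liner keeps the first line, remembers the previous line's 3rd field in f, and uses the flags i and j to decide which of the following lines get printed. If the input is not guaranteed to look exactly like this, a less input-specific two-pass version may help. This is only a sketch, assuming the goal is to drop just the first line of every group of lines sharing the same 3rd field and keep all later occurrences:

Code:
awk 'NR == FNR { cnt[$3]++; next }              # 1st pass: count lines per 3rd field
     cnt[$3] > 1 && !dropped[$3]++ { next }     # 2nd pass: skip only the first line of a repeated value
     { print }' file_name file_name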


Thanks,
R. Singh

Last edited by RavinderSingh13; 01-28-2014 at 11:06 AM. Reason: added a note

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

How to remove duplicates without sorting

Hello, I can remove duplicate entries in a file by: sort File1 | uniq > File2 but how can I remove duplicates without sorting the file? I tried cat File1 | uniq > File2 but it doesn't work thanks (4 Replies)
Discussion started by: orahi001
4 Replies
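
For a question like this one, the usual answer is the awk idiom below, which keeps the first occurrence of every line and preserves the original order (File1 and File2 are the names used in the question):

Code:
awk '!seen[$0]++' File1 > File2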

2. Shell Programming and Scripting

Remove duplicates

Hello Experts, I have two files named old and new. Below are my example files. I need to compare them and print the records that only exist in my new file. I tried the below awk script; it works perfectly well if the records match exactly, but the issue I have is that my old file has got extra... (4 Replies)
Discussion started by: forumthreads
4 Replies
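
A common starting point for this kind of old-vs-new comparison is the two-file awk below. It is only a sketch: it compares whole lines, so it does not address the truncated remark about the old file having extra fields:

Code:
awk 'NR == FNR { old[$0]; next }   # 1st file: remember every record of "old"
     !($0 in old)' old new         # 2nd file: print records not present in "old"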

3. Shell Programming and Scripting

remove duplicates and sort

Hi, I'm using the below command to sort and remove duplicates in a file. But I need to apply this to the same file instead of directing the output to another. Thanks (6 Replies)
Discussion started by: dvah
6 Replies
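
For writing the result back to the same file, sort's -o option is the usual answer, since sort reads all of its input before it opens the output file (file is a placeholder name):

Code:
sort -u -o file file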

4. Shell Programming and Scripting

bash - remove duplicates

I need to use a bash script to remove duplicate files from a download list, but I cannot use uniq because the urls are different. I need to go from this: http://***/fae78fe/file1.wmv http://***/39du7si/file1.wmv http://***/d8el2hd/file2.wmv http://***/h893js3/file2.wmv to this: ... (2 Replies)
Discussion started by: locoroco
2 Replies
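
Since only the trailing file name matters here, one sketch is to split on "/" and de-duplicate on the last field, keeping the first URL seen for each file name (urls.txt is a hypothetical list file):

Code:
awk -F/ '!seen[$NF]++' urls.txt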

5. Shell Programming and Scripting

Awk: Remove Duplicates

I have the following code for removing duplicate records based on fields in the input file; it moves the duplicate records to a duplicates file (1st awk), and in the 2nd awk I fetch the non-duplicate entries from the input file into a tmp file and use move to update the original file. Requirement: Can both the awk... (4 Replies)
Discussion started by: siramitsharma
4 Replies
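
The two steps can usually be folded into a single pass along these lines. This is only a sketch, since the actual key fields are not shown in the excerpt (the first field and the file names are assumptions):

Code:
awk '{ if (seen[$1]++)
           print > "duplicates"        # 2nd and later occurrences of the key
       else
           print > "inputfile.tmp"     # first occurrence of each key
     }' inputfile && mv inputfile.tmp inputfile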

6. Shell Programming and Scripting

Remove duplicates

I have a file with the following format: fields separated by "|" title1|something class|long...content1|keys title2|somhing class|log...content1|kes title1|sothing class|lon...content1|kes title3|shing cls|log...content1|ks I want to remove all duplicates with the same "title field" (the... (3 Replies)
Discussion started by: dtdt
3 Replies
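
Assuming the first line for each title should be kept and later lines with the same title dropped, the field-1 variant of the !seen idiom is a reasonable sketch (file is a placeholder):

Code:
awk -F'|' '!seen[$1]++' file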

7. Shell Programming and Scripting

Remove top 3 duplicates

hello , I have a requirement with input in below format abc 123 xyz bcd 365 kii abc 987 876 cdf 987 uii abc 456 yuu bcd 654 rrr Expecting Output abc 456 yuu bcd 654 rrr cdf 987 uii (1 Reply)
Discussion started by: Tomlight
1 Replies
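
The expected output keeps only the last line seen for each value of the first field, so one sketch is to remember the last line per key and print everything at the end. The order of for (k in last) is unspecified, hence the trailing sort, since the expected output appears to be sorted by key (file is a placeholder):

Code:
awk '{ last[$1] = $0 } END { for (k in last) print last[k] }' file | sort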

8. Shell Programming and Scripting

Sort and Remove duplicates

Here is my task : I need to sort two input files and remove duplicates in the output files : Sort by 13 characters from 97 Ascending Sort by 1 characters from 96 Ascending If duplicates are found retain the first value in the file the input files are variable length, convert... (4 Replies)
Discussion started by: ysvsr1
4 Replies
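
Leaving aside the length conversion, one decorate/sort/de-duplicate sketch for the stated keys (characters 97-109 as the primary key, character 96 as the secondary, keeping the first record from the file for each key) could look like this. It assumes the records contain no tab characters and that sort supports -s for a stable sort (GNU and BSD sort do); infile and outfile are placeholders:

Code:
awk '{ print substr($0,97,13) substr($0,96,1) "\t" $0 }' infile |   # prepend the sort key
    sort -t "$(printf '\t')" -k1,1 -s |                             # stable sort on the key only
    awk -F"\t" '!seen[$1]++ { sub(/^[^\t]*\t/, ""); print }' > outfile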

9. Shell Programming and Scripting

Remove duplicates

Hi I have a below file structure. 200,1245,E1,1,E1,,7611068,KWH,30, ,,,,,,,, 200,1245,E1,1,E1,,7611070,KWH,30, ,,,,,,,, 300,20140223,0.001,0.001,0.001,0.001,0.001 300,20140224,0.001,0.001,0.001,0.001,0.001 300,20140225,0.001,0.001,0.001,0.001,0.001 300,20140226,0.001,0.001,0.001,0.001,0.001... (1 Reply)
Discussion started by: tejashavele
1 Replies

10. Shell Programming and Scripting

awk - Remove duplicates during array build

Greetings Experts, Issue: Within awk script, remove the duplicate occurrences that are space (1 single space character) separated Description: I am processing 2 files using awk and during processing, I am building an array and there are duplicates on this; how can I delete the duplicates... (3 Replies)
Discussion started by: chill3chee
3 Replies
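
Without the surrounding script it is hard to be specific, but one way to keep a space-separated array value duplicate-free while it is being built is a small helper function like the one below (the names are illustrative only):

Code:
# append val to arr[key] only if it is not already one of the
# space-separated items stored there
function add_unique(arr, key, val) {
    if (index(" " arr[key] " ", " " val " ") == 0)
        arr[key] = (arr[key] == "" ? val : arr[key] " " val)
}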