Top Forums > UNIX for Dummies Questions & Answers
How to delete or remove duplicate lines in a file
Posted by reva on Monday 20th of July 2009, 01:30:43 AM

Hi, please help me with how to remove duplicate lines from a file.
I have a file with a huge number of lines, and I want to remove selected lines from it.
Also, where duplicate lines exist, I want to delete the repeats and keep just one copy.
Please help me with any Unix commands, or even a Fortran program.
For example:
Code:
 SIG   50   12   0   34   87   3.00  37.0000N  100.0000E
 SIG   50   12   0   34   87   3.00  37.0000N  100.0000E
 SIG   18    7   9    0    0   0.00  36.0000N   60.0000E
 SSR   40    7   0    0    0   0.00  35.2000N   60.4000E

Here I want the output to look like:
Code:
 SIG   50   12   0   34   87   3.00  37.0000N  100.0000E
 SIG   18    7   9    0    0   0.00  36.0000N   60.0000E
 SSR   40    7   0    0    0   0.00  35.2000N   60.4000E
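The standard one-liners for this (a sketch; the file names input.txt and deduped.txt are placeholders, and awk/sort are assumed to be the usual POSIX tools):

```shell
# Sample data: the second line is an exact duplicate of the first.
printf '%s\n' \
  ' SIG   50   12   0   34   87   3.00  37.0000N  100.0000E' \
  ' SIG   50   12   0   34   87   3.00  37.0000N  100.0000E' \
  ' SIG   18    7   9    0    0   0.00  36.0000N   60.0000E' \
  ' SSR   40    7   0    0    0   0.00  35.2000N   60.4000E' > input.txt

# Keep only the first occurrence of each line, preserving input order.
# seen[] is an awk associative array: !seen[$0]++ is true only the
# first time a given whole line ($0) is encountered.
awk '!seen[$0]++' input.txt > deduped.txt
```

`sort -u input.txt` does the same job if you do not need to preserve the original line order. Note that lines must match byte for byte (trailing spaces count) to be treated as duplicates.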


Last edited by Yogesh Sawant; 07-20-2009 at 07:36 AM. Reason: added code tags
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Remove Duplicate Lines in File

I am writing a KSH script to remove duplicate lines in a file. Let's say the file has the format below. FileA 1253-6856 3101-4011 1827-1356 1822-1157 1822-1157 1000-1410 1000-1410 1822-1231 1822-1231 3101-4011 1822-1157 1822-1231 and I want to simplify it with no duplicate lines as file... (5 Replies)
Discussion started by: Teh Tiack Ein

2. UNIX for Dummies Questions & Answers

Remove Duplicate lines from File

I have a log file "logreport" that contains several lines as seen below: 04:20:00 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping 06:38:08 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping 07:11:05 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but... (18 Replies)
Discussion started by: Nysif Steve
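For log lines like these, where only the leading timestamp differs, the duplicate test has to skip the first field. A sketch (the file names and sample messages are hypothetical; assumes a standard awk):

```shell
# Sample log: two lines carry the same message under different stamps.
printf '%s\n' \
  '04:20:00 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping' \
  '06:38:08 /usr/lib/snmp/snmpdx: Agent snmpd appeared dead but responded to ping' \
  '07:11:05 /usr/lib/snmp/snmpdx: Received SIGHUP' > logreport

# Build the dedup key from everything after the first field, so lines
# differing only in the HH:MM:SS stamp count as duplicates; the
# earliest occurrence of each message is kept.
awk '{ key = $0; sub(/^[^ ]+ +/, "", key) } !seen[key]++' logreport > unique.log
```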

3. Shell Programming and Scripting

delete semi-duplicate lines from file?

Ok here's what I'm trying to do. I need to get a listing of all the mountpoints on a system into a file, which is easy enough, just using something like "mount | awk '{print $1}'" However, on a couple of systems, they have some mount points looking like this: /stage /stand /usr /MFPIS... (2 Replies)
Discussion started by: paqman

4. UNIX for Dummies Questions & Answers

Delete duplicate lines and print to file

OK, I have read several things on how to do this, but can't make it work. I am writing this to a vi file then calling it as an awk script. So I need to search a file for duplicate lines, delete duplicate lines, then write the result to another file, say /home/accountant/files/docs/nodup ... (2 Replies)
Discussion started by: bfurlong

5. UNIX for Advanced & Expert Users

In a huge file, Delete duplicate lines leaving unique lines

Hi All, I have a very huge file (4GB) which has duplicate lines. I want to delete the duplicate lines, leaving only unique lines. Sort, uniq, and awk '!x++' are not working, as they run out of buffer space. I don't know if this works: I want to read each line of the file in a for loop, and want to... (16 Replies)
Discussion started by: krishnix
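When the file dwarfs RAM, sort(1) is usually the practical answer: it merges through temporary files on disk instead of holding everything in memory. A sketch (file names are placeholders; -u is POSIX, -T is widely supported in GNU and BSD sort):

```shell
# Small stand-in for the 4 GB file; the approach is the same at any size.
printf '%s\n' 'b' 'a' 'b' 'c' 'a' > huge.txt

# -u keeps one copy of each distinct line; -T picks the directory for
# the on-disk spill files (point it at a filesystem with enough room).
sort -u -T "${TMPDIR:-/tmp}" huge.txt > unique.txt
```

The output comes back sorted rather than in original order; `awk '!x[$0]++'` preserves order but needs memory proportional to the number of unique lines, which is exactly what ran out here.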

6. Shell Programming and Scripting

remove duplicate lines from file linux/sh

greetings, i'm hoping there is a way to cat a file, remove duplicate lines and send that output to a new file. the file will always vary but be something similar to this: please keep in mind that the above could be eight occurrences of each hostname or it might simply have another four of an... (2 Replies)
Discussion started by: crimso

7. Shell Programming and Scripting

How do I remove the duplicate lines in this file?

Hey guys, need some help to fix this script. I am trying to remove all the duplicate lines in this file. I wrote the following script, but does not work. What is the problem? The output file should only contain five lines: Later! (5 Replies)
Discussion started by: Ernst

8. Shell Programming and Scripting

Remove duplicate lines from a 50 MB file size

hi, Please help me write a command to delete duplicate lines from a file. The size of the file is 50 MB. How do I remove duplicate lines from such a big file? (6 Replies)
Discussion started by: vsachan

9. Shell Programming and Scripting

Remove duplicate lines from a file

Hi, I have a csv file which contains some millions of lines. The first line (the header) repeats at every 50000th line. I want to remove all the duplicate headers from the second occurrence onwards (it should not remove the first line). I don't want to use any pattern from the header, as I have some... (7 Replies)
Discussion started by: sudhakar T
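Since the repeats are exact copies of line 1, a script can remember the first line verbatim and drop later matches, so no knowledge of the header's content is needed. A sketch with made-up sample data:

```shell
# Sample CSV: the header line reappears partway through the data.
printf '%s\n' 'id,name' '1,alpha' '2,beta' 'id,name' '3,gamma' > report.csv

# Remember line 1 and print it; afterwards print only lines that are
# not an exact repeat of it.
awk 'NR == 1 { hdr = $0; print; next } $0 != hdr' report.csv > clean.csv
```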

10. Shell Programming and Scripting

Remove duplicate lines, sort it and save it as file itself

Hi, all I have a csv file that I would like to remove duplicate lines based on 1st field and sort them by the 1st field. If there are more than 1 line which is same on the 1st field, I want to keep the first line of them and remove the rest. I think I have to use uniq or something, but I still... (8 Replies)
Discussion started by: refrain
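One way to do this with GNU sort (a sketch; the file name and sample data are placeholders, and -s is a GNU extension):

```shell
# Sample CSV keyed on the first comma-separated field.
printf '%s\n' '3,zulu' '1,alpha' '3,yankee' '2,bravo' '1,charlie' > data.csv

# -t, -k1,1 sorts on the first field only; with -u, lines whose keys
# compare equal collapse to one. -s makes the sort stable, so the
# surviving line for each key is the one that appeared first in the file.
# Write to a temp file and rename, because redirecting straight onto
# data.csv would truncate it before sort could read it.
sort -t, -k1,1 -s -u data.csv > data.csv.tmp && mv data.csv.tmp data.csv
```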
Net::Server::SIG(3)					User Contributed Perl Documentation				       Net::Server::SIG(3)

NAME
    Net::Server::SIG - Safer signal handling

SYNOPSIS
    use Net::Server::SIG qw(register_sig check_sigs);
    use IO::Select ();
    use POSIX qw(WNOHANG);

    my $select = IO::Select->new();

    register_sig(PIPE => 'IGNORE',
                 HUP  => 'DEFAULT',
                 USR1 => sub { print "I got a SIG $_[0]\n"; },
                 USR2 => sub { print "I got a SIG $_[0]\n"; },
                 CHLD => sub { 1 while waitpid(-1, WNOHANG) > 0; },
                 );

    # add some handles to the select
    $select->add(\*STDIN);

    # loop forever trying to stay alive
    while (1) {

        # use a timeout so that signals delivered while we were
        # processing another signal are still picked up
        my @fh = $select->can_read(10);

        # this is the handler for safe signals (fine under unsafe also)
        if (check_sigs()) {    # or: my @sigs = check_sigs();
            next unless @fh;
        }

        my $handle = $fh[0];

        # do something with the handle
    }

DESCRIPTION
    Signals in Perl prior to 5.7 were unsafe; since then they have been
    implemented with a safer algorithm. Net::Server::SIG provides backwards
    compatibility while still working reliably with newer releases.

    Net::Server::SIG attempts to fix the unsafe-signal problem using a
    property of select(): if a process is blocking on select(), any signal
    will short-circuit the select. With this in mind, the signal handler
    does the least work possible (changing one bit from 0 to 1) and depends
    on the actual processing of the signals taking place immediately after
    the select() call, via the check_sigs() function. See the example shown
    above, and also the sigtest.pl script located in the examples directory
    of this distribution.

FUNCTIONS
    register_sig($SIG => $code_ref)
        Takes key/value pairs where the key is a signal name and the value
        is either a code ref or one of the words 'DEFAULT' or 'IGNORE'.
        register_sig must be used in conjunction with check_sigs and with a
        blocking select() call -- otherwise you will observe the registered
        signal mysteriously vanish.

    unregister_sig($SIG)
        Takes the name of a signal as an argument. Calls register_sig with
        this signal name and 'DEFAULT' (the same as
        register_sig($SIG, 'DEFAULT')).

    check_sigs()
        Checks whether any registered signals have occurred. If so, it runs
        the registered code ref for each. The return value is an array of
        the signal names that had occurred.

    sig_is_registered($SIG)
        Takes a signal name and returns any registered code ref for that
        signal.

AUTHORS
    Paul Seamons (paul@seamons.com)

    Rob B Brown (rob@roobik.com) - provided a sounding board and feedback
    in creating Net::Server::SIG and sigtest.pl.

LICENSE
    This package may be distributed under the terms of either the GNU
    General Public License or the Perl Artistic License.

    All rights reserved.

perl v5.16.2                      2012-05-29              Net::Server::SIG(3)
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.