Shell Programming and Scripting: Remove duplicates from end of file
Posted by jsmithstl on 11-13-2009 at 08:04 AM
This Perl script should do what you want. It uses a hash to remember lines it has already seen and prints each line only the first time it appears, so the original order is preserved.

Code:
#!/usr/bin/perl

use strict;
use warnings;

my %seen;     # hash used to determine if a line has been seen
my @a_line;   # array to store unique lines in their original order

#
# Loop through all lines in the file.
#
while (<>)
{
   #
   # Skip the line if it has been seen before.
   #
   unless ($seen{$_})
   {
      #
      # Mark the line as seen and store it.
      #
      $seen{$_}++;
      push( @a_line, $_ );
   }
}

#
# Print out each unique line.
#
foreach (@a_line)
{
   print;
}

exit 0;
#
# end of dup.pl
#

Sample Input Data

B
A
C
A
C

Output

./dup.pl < dup.dat
B
A
C
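
If you would rather not keep a separate script around, the same hash-based logic fits in a one-liner. A minimal sketch of the equivalent (the hash key is the whole line, and a line is printed only the first time it is seen):

Code:
perl -ne 'print unless $seen{$_}++' dup.dat

awk has the same idiom:

Code:
awk '!seen[$0]++' dup.dat

Both keep the first occurrence of each line and preserve the original order, just like the script above.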

Hope this helps.
 
