Perl: filtering lines based on duplicate values in a column
Post 302558152 by rangarasan, 09-23-2011 at 01:48 AM

Hi,

Try this code,

Code:
#!/usr/local/bin/perl
use strict;
use warnings;

# Build a hash keyed on the first (space-separated) field of each line,
# tracking how many times the key occurs and the last line seen for it.
open( FILE, "<", "File1" ) or die("unable to open file: $!");
my @mContent = <FILE>;
close( FILE );
my %mFinal = ();
foreach ( @mContent )
{
   my $mLine = $_;
   chomp ( $mLine );
   my $mField = (split(/ /,$mLine))[0];
   $mFinal{$mField}{"count"}++;
   $mFinal{$mField}{"content"} = $mLine;
}

# Print one line per first-column value whose count is not exactly 10.
foreach my $mField ( keys %mFinal )
{
   my $mCount = $mFinal{$mField}{"count"};
   if ( $mCount != 10 )
   {
      print "$mFinal{$mField}{'content'}\n";
   }
}
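Note that the hash keeps only the last line read for each first-column value, so the script prints at most one line per value (for every value that does not occur exactly 10 times). If the aim is instead to print every line whose first field is duplicated, a two-pass variant along these lines should work (an untested sketch under the same assumptions: space-separated fields, input file named File1):

Code:
#!/usr/local/bin/perl
# Sketch: print every line whose first space-separated field
# occurs more than once in File1.
use strict;
use warnings;

open( FILE, "<", "File1" ) or die("unable to open file: $!");
my @mContent = <FILE>;
close( FILE );

my %mCount = ();
foreach my $mLine ( @mContent )      # pass 1: count first-column values
{
   chomp( $mLine );                  # aliasing also chomps the copy in @mContent
   my $mField = ( split( / /, $mLine ) )[0];
   $mCount{$mField}++;
}
foreach my $mLine ( @mContent )      # pass 2: print lines with repeated keys
{
   my $mField = ( split( / /, $mLine ) )[0];
   print "$mLine\n" if $mCount{$mField} > 1;
}

The count test in the second loop can of course be changed back to != 10 to match the original requirement.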


Cheers,
Ranga

10 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

Filtering duplicate lines

Does anybody know a command that filters duplicate lines out of a file? Something like the uniq command, but able to handle duplicate lines no matter where they occur in a file? (9 Replies)
Discussion started by: AreaMan
9 Replies
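A common one-liner for this (a sketch, not taken from that thread; "inputfile" is just a placeholder name) remembers every line already seen in a hash and prints only the first occurrence:

Code:
perl -ne 'print unless $seen{$_}++' inputfile

Unlike uniq, this works even when the duplicates are not adjacent, at the cost of holding the seen lines in memory.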

2. Shell Programming and Scripting

Joining multiple files based on one column with different and similar values (shell or perl)

Hi, I have nine files looking similar to file1 & file2 below. File1: 1 ABCA1 1 ABCC8 1 ABR:N 1 ACACB 1 ACAP2 1 ACOT1 1 ACSBG 1 ACTR1 1 ACTRT 1 ADAMT 1 AEN:N 1 AKAP1 File2: 1 A4GAL 1 ACTBL 1 ACTL7 (4 Replies)
Discussion started by: seqbiologist
4 Replies
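The exact output wanted there is not shown in this excerpt, but a hash-of-arrays keyed on the join column (assumed below to be column 2, with the file names passed on the command line) is the usual building block; a rough sketch:

Code:
#!/usr/local/bin/perl
# Sketch: join the files given on the command line on their second column,
# printing the key followed by each file's column-1 value ("-" if absent).
use strict;
use warnings;

my @files = @ARGV;                    # e.g. perl join.pl file1 file2 ... file9
my %table;
for my $i ( 0 .. $#files ) {
    open( my $fh, "<", $files[$i] ) or die "unable to open $files[$i]: $!";
    while ( my $line = <$fh> ) {
        chomp $line;
        my ( $value, $key ) = split( /\s+/, $line );
        $table{$key}[$i] = $value;
    }
    close( $fh );
}
for my $key ( sort keys %table ) {
    my @row = map { defined $table{$key}[$_] ? $table{$key}[$_] : "-" } 0 .. $#files;
    print join( "\t", $key, @row ), "\n";
}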

3. Shell Programming and Scripting

Filtering lines for column elements based on corresponding counts in another column

Hi, I have a file like this ACC 2 2 21 aaa AC 443 3 22 aaa GCT 76 1 33 xxx TCG 34 2 33 aaa ACGT 33 1 22 ggg TTC 99 3 44 wee CCA 33 2 33 ggg AAC 1 3 55 ddd TTG 10 1 22 ddd TTGC 98 3 22 ddd GCT 23 1 21 sds GTC 23 4 32 sds ACGT 32 2 33 vvv CGT 11 2 33 eee CCC 87 2 44... (1 Reply)
Discussion started by: polsum
1 Replies

4. UNIX for Dummies Questions & Answers

[SOLVED] remove lines that have duplicate values in column two

Hi, I've got a file that I'd like to uniquely sort based on column 2 (values in column 2 begin with "comp"). I tried sort -t -nuk2,3 file.txt but got: sort: multi-character tab `-nuk2,3'. "man sort" did not help me out. Any pointers? Input: Output: (5 Replies)
Discussion started by: pathunkathunk
5 Replies
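The sort error comes from -t being given no delimiter character, so sort reads -nuk2,3 itself as the delimiter. As an alternative, a Perl one-liner can keep the first line seen for each column-2 value (a sketch, assuming tab-separated input in file.txt):

Code:
perl -F'\t' -lane 'print unless $seen{$F[1]}++' file.txt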

5. UNIX for Dummies Questions & Answers

awk solution to duplicate lines based on column

Hi experts, I have a tab-delimited file with one column containing values separated by a comma. I wish to duplicate the entire line for every value in that comma-delimited field. For example: $cat file 4444 4444 4444 4444 9990 2222,7777 6666 2222 ... (3 Replies)
Discussion started by: torchij
3 Replies
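Assuming tab-delimited input with the comma-separated values in the second column (the exact column is not visible in this excerpt), a Perl take on it rather than awk could look like this sketch:

Code:
perl -F'\t' -lane 'for my $v (split /,/, $F[1]) { my @out = @F; $out[1] = $v; print join("\t", @out) }' file

Lines without a comma in that column pass through unchanged, since split then returns the single original value.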

6. Shell Programming and Scripting

awk to sum a column based on duplicate strings in another column and show split totals

Hi, I have a similar input format- A_1 2 B_0 4 A_1 1 B_2 5 A_4 1 and am looking to print in this output format with headers. Can you suggest an awk approach? awk, because I am doing some pattern matching from the parent file to print column 1 of my input using awk already. Thanks! letter number_of_letters... (5 Replies)
Discussion started by: prashob123
5 Replies
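It is not clear from the excerpt whether the grouping should be on the full column-1 string or only on the letter before the underscore; a sketch (in Perl rather than awk) that sums column 2 per exact column-1 value:

Code:
perl -lane '$sum{$F[0]} += $F[1]; END { print "$_\t$sum{$_}" for sort keys %sum }' file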

7. Shell Programming and Scripting

Removing duplicate lines on first column based with pipe delimiter

Hi, I have tried to remove duplicate lines based on the first column with a pipe delimiter, but I am not able to get unique lines. Command: sort -t'|' -nuk1 file.txt Input: 38376KZ|09/25/15|1.057 38376KZ|09/25/15|1.057 02006YB|09/25/15|0.859 12593PS|09/25/15|2.803... (2 Replies)
Discussion started by: parithi06
2 Replies
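With sort, the key has to be limited to the first field (-k1,1 rather than -k1, which extends to the end of the line) for -u to deduplicate on column 1 alone. The same effect in Perl, keeping the first line seen per column-1 value (a sketch):

Code:
perl -F'\|' -lane 'print unless $seen{$F[0]}++' file.txt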

8. Shell Programming and Scripting

Find duplicate values in specific column and delete all the duplicate values

Dear folks, I have a map file of around 54K lines, and some of the entries in the second column share the same value; I want to find them and delete all of them. I looked over duplicate commands, but my case is not to keep one of the duplicate values. I want to remove all of the same... (4 Replies)
Discussion started by: sajmar
4 Replies
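One way (a sketch, assuming whitespace-separated columns; "mapfile" is a placeholder name) is to count the column-2 values while holding the lines in memory, then print only the lines whose value occurs exactly once:

Code:
perl -lane 'push @lines, [$F[1], $_]; $count{$F[1]}++; END { print $_->[1] for grep { $count{$_->[0]} == 1 } @lines }' mapfile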

9. UNIX for Beginners Questions & Answers

Filtering based on column values

Hi there, I am trying to filter a big file with several columns using values in a column with entries like (AC=5;AN=10;SF=341,377,517,643,662;VRT=1). I want to filter the data based on SF= values that are bigger than 400 ... (25 Replies)
Discussion started by: daashti
25 Replies
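Whether the thread wanted lines where any SF value exceeds 400 or where all of them do is not clear from the excerpt; a sketch that keeps a line when at least one SF value is greater than 400:

Code:
perl -lne 'print if /SF=([\d,]+)/ && grep { $_ > 400 } split /,/, $1' file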

10. UNIX for Beginners Questions & Answers

Find lines with duplicate values in a particular column

I have a file with 5 columns. I want to pull out all records where the value in column 4 is not unique. For example in the sample below, I would want it to print out all lines except for the last two. 40991764 2419 724 47182 Cand A 40992936 3591 724 47182 Cand B 40993016 3671 724 47182 Cand C... (5 Replies)
Discussion started by: kaktus
5 Replies
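A sketch along the same lines as the post above, assuming whitespace-separated columns with the key in column 4: count the column-4 values first, then print the lines whose value occurs more than once:

Code:
perl -lane '$count{$F[3]}++; push @lines, $_; END { for my $line (@lines) { my @f = split " ", $line; print $line if $count{$f[3]} > 1 } }' file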