Shell Programming and Scripting: Perl, filtering the log file and removing the duplicates
Post 302837537 by Subbeh on 26 July 2013 at 04:02 AM
scriptscript, did you try it yourself first? With a little research into how to open files and use hashes in Perl, you could write this on your own:

Code:
#!/usr/bin/perl
use strict;
use warnings;

my %h;

# Mask the numbered type/state tokens so that, e.g., "type1" and
# "type2" both become "<type>", then count each resulting line.
open(my $fh, '<', '/path/to/file') or die "Unable to open file: $!";
while (<$fh>) {
    s/(type|state|TYPE)[0-9]/<$1>/;
    $h{$_}++;
}
close($fh);

# Print each unique (masked) line, prefixed with how often it occurred.
# The lines still carry their newline, so no extra "\n" is needed.
while (my ($k, $v) = each %h) {
    print "$v\t$k";
}
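
For illustration, here's a hypothetical run. The actual log format was never posted, so the sample lines (and the script name dedup.pl) are made up; the point is just that lines differing only in their type/state number collapse into a single counted entry. Note that each %h returns entries in no particular order.

Code:
$ cat /path/to/file
ERROR state1 connection lost
ERROR state2 connection lost
WARN type3 retrying

$ ./dedup.pl
2	ERROR <state> connection lost
1	WARN <type> retrying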

 
