Find duplicate value and create an
Posted by ricky007 on 02-26-2008, 02:13 PM

I have compiled the following and it works fine, but I am getting the IPs and device names all mixed together. I need to separate them: one section with only the duplicate device names, and a second paragraph with only the duplicate IPs:

#!/opt/sa/bin/perl
use strict;
use warnings;
use MIME::Lite;

# Track every IP and hostname already seen; collect any repeated lines.
my (%ip, %host);
my $duplicates = '';
my $host_file  = '/etc/hosts';

open my $file, '<', $host_file or die "can't open $host_file: $!";
while (<$file>) {
    # Match "IP  hostname" lines, including ones commented out with '#'.
    if ( my ($ip, $host) = /^#?([\d.]+)\s+(\S+)/ ) {
        if ( exists $ip{$ip} or exists $host{$host} ) {
            $duplicates .= $_;    # seen before: keep the whole line
        }
        else {
            $ip{$ip}++;
            $host{$host}++;
        }
    }
}
close $file;

my $email_msg = <<EMAIL_MSG;
The following entries in the host file are duplicates
either by IP address or by hostname.
$duplicates
EMAIL_MSG

my $email = MIME::Lite->new(
    From    => 'xx@xx.com',
    To      => 'xv@xv.com',
    Cc      => '33@xx.com,42@xx.com',
    Subject => 'Host file duplicates',
    Data    => $email_msg,
);
$email->send;


Output of the above script:

119.76.169.15 aaaa
129.76.169.2 bcd
#192.16.5.30 headm
#79.158.221.4 tailm




I need output like the following when the conditions are true:

Duplicate IP found:

139.76.169.15
129.76.169.2


Duplicate host/device found:

headm
tailm
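
One way to get exactly that layout is to keep two separate hashes, one keyed by IP and one keyed by hostname, and record a value only the second time it turns up. The following is only a sketch (not tested against your real hosts file); it reuses your regex and MIME::Lite call with the same placeholder addresses:

#!/opt/sa/bin/perl
use strict;
use warnings;
use MIME::Lite;

my (%ip_seen, %host_seen, @dup_ips, @dup_hosts);

my $host_file = '/etc/hosts';
open my $fh, '<', $host_file or die "can't open $host_file: $!";
while (<$fh>) {
    if ( my ($ip, $host) = /^#?([\d.]+)\s+(\S+)/ ) {
        # The post-increment is false the first time a key is seen,
        # so only the second and later occurrences get pushed.
        push @dup_ips,   $ip   if $ip_seen{$ip}++;
        push @dup_hosts, $host if $host_seen{$host}++;
    }
}
close $fh;

# Only send a report when there is something to report.
if (@dup_ips or @dup_hosts) {
    my $email_msg = "Duplicate IP found:\n\n"
                  . join("\n", @dup_ips)
                  . "\n\nDuplicate host/device found:\n\n"
                  . join("\n", @dup_hosts)
                  . "\n";

    MIME::Lite->new(
        From    => 'xx@xx.com',
        To      => 'xv@xv.com',
        Cc      => '33@xx.com,42@xx.com',
        Subject => 'Host file duplicates',
        Data    => $email_msg,
    )->send;
}

The two join calls produce exactly the two sections shown above. If you would rather see the full /etc/hosts lines in each section, push $_ instead of the captured values.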
