02-25-2019
To see which files will be deleted:
fdupes -f
This displays the full list of duplicates, omitting the first file in each set (the one that would be kept).
Then, to delete the files in that list without prompting, use
fdupes -dN
But in your case I would use the interactive mode, at least until you get acquainted with the subtleties of this tool.
Good luck
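As an aside: if fdupes is not installed, the preview step can be sketched with standard tools. This is a hypothetical illustration (it assumes GNU md5sum is available; all file names are made up): it groups files by checksum and prints every file after the first in each group, which is the same set fdupes -f would show.

```shell
# Scratch directory with one duplicate pair, for demonstration only.
dir=$(mktemp -d)
printf 'same\n'  > "$dir/a.txt"
printf 'same\n'  > "$dir/b.txt"   # byte-for-byte duplicate of a.txt
printf 'other\n' > "$dir/c.txt"

# Group files by checksum; print every file after the first in each group.
dupes=$(find "$dir" -type f -exec md5sum {} + |
        sort |
        awk 'seen[$1]++ { print $2 }')
echo "$dupes"
rm -rf "$dir"
```

Unlike fdupes, this sketch does not confirm duplicates byte-by-byte after hashing, so treat it as a preview only.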
--- Post updated at 15:28 ---
Quote:
Originally Posted by
nezabudka
diff utility with such files should be silent
But that is not guaranteed when you compare binary files. The point is that fdupes handles binary files as well, whereas diff is a text tool.
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
I want to find duplicates in a file based on the 2nd field. I wrote this code:
nawk '{a++} END{for i in a {if (a>1) print}}' temp
Could not find what's wrong with this.
Appreciate help (5 Replies)
Discussion started by: pinnacle
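For what it's worth, the one-liner quoted above has two problems: the array is never indexed by the field, and awk's for-in needs parentheses. A corrected sketch (plain awk in place of nawk; the sample data is made up):

```shell
temp=$(mktemp)
printf 'x foo\ny bar\nz foo\n' > "$temp"

# Index the array by the 2nd field, parenthesise the for-in, and test a[i]:
dups=$(awk '{ a[$2]++ } END { for (i in a) if (a[i] > 1) print i }' "$temp")
echo "$dups"
rm -f "$temp"
```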
2. Shell Programming and Scripting
I have a folder which in turn has numerous sub-folders, all containing PDF files with the same file named in different ways.
So I need a script, if one can be written, to find and print the duplicate files (that is, files with the same size) along with their respective paths.
So I assume here that same file... (5 Replies)
Discussion started by: deaddevil
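A hedged sketch of the size-based approach (assumes GNU find for -printf; file names are invented): count each size in a first pass, then print the paths whose size occurs more than once.

```shell
dir=$(mktemp -d)
mkdir -p "$dir/sub"
printf '12345' > "$dir/a.pdf"
printf '12345' > "$dir/sub/b.pdf"    # same size as a.pdf
printf '9'     > "$dir/c.pdf"

# Pass 1 counts each size; pass 2 prints paths whose size occurs more than once.
sizes=$(mktemp)
find "$dir" -type f -printf '%s %p\n' | sort -n > "$sizes"
same=$(awk 'NR==FNR { cnt[$1]++; next } cnt[$1] > 1 { print $2 }' "$sizes" "$sizes")
echo "$same"
rm -f "$sizes"
rm -rf "$dir"
```

Equal size is only a heuristic; compare checksums before trusting that two files really are duplicates.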
3. Shell Programming and Scripting
Hi All,
I am an Oracle Apps tech guy. I have a requirement to check whether 777 permissions are set on all folders and sub-folders
under APPL_TOP (a directory), with the conditions below:
i) the directory names should start with xx..... (like xxau,xxcfi,xxcca...etc)
and exclude the directory... (11 Replies)
Discussion started by: gagan4599
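The core of that requirement can be sketched with find (directory names here are invented stand-ins for the xx... folders):

```shell
base=$(mktemp -d)
mkdir -p "$base/xxau" "$base/xxcfi" "$base/other"
chmod 777 "$base/xxau"
chmod 755 "$base/xxcfi" "$base/other"

# Directories whose name starts with xx and whose mode is exactly 777.
hits=$(find "$base" -type d -name 'xx*' -perm 777)
echo "$hits"
```

Note that -perm 777 matches the exact mode; use -perm -777 to match "at least these bits". Excluding particular directories can be added with -path ... -prune.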
4. Shell Programming and Scripting
Hello,
My text file has input of the form
abc dft45.xml
ert rt653.xml
abc ert57.xml
I need to write a perl script/shell script to find duplicates in the first column and write it into a text file of the form...
abc dft45.xml
abc ert57.xml
Can someone help me, please? (5 Replies)
Discussion started by: gameboy87
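One way to do this without Perl, shown here as a sketch on the sample data from the question: count first-column values in a first pass, then print every line whose first column occurred more than once.

```shell
in=$(mktemp)
printf 'abc dft45.xml\nert rt653.xml\nabc ert57.xml\n' > "$in"

# Pass 1 counts first-column values; pass 2 prints lines whose count exceeds 1.
out=$(awk 'NR==FNR { cnt[$1]++; next } cnt[$1] > 1' "$in" "$in")
echo "$out"
rm -f "$in"
```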
5. UNIX for Dummies Questions & Answers
example data
5666700842511TAfmoham03151008075205999900000001000001000++
5666700843130MAfmoham03151008142606056667008390315100005001
6666666663130MAfmoham03151008142606056667008390315100005001
I'd like to sort on positions 10-14 where the characters are equal to "130MA".
Then based on positions... (0 Replies)
Discussion started by: mmarshall
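Fixed-width selection like this is usually done with awk's substr. A sketch on invented data (adjust the offset 10 to the real record layout; the sample in the question suggests the field may actually start a column later):

```shell
fixed=$(mktemp)
printf 'AAAAAAAAA130MAxxx\nBBBBBBBBB999ZZyyy\nCCCCCCCCC130MAzzz\n' > "$fixed"

# Keep lines whose 5 characters starting at column 10 equal "130MA".
sel=$(awk 'substr($0, 10, 5) == "130MA"' "$fixed")
echo "$sel"
rm -f "$fixed"
```

The selected lines can then be piped to sort with whatever key the follow-up step needs.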
6. UNIX for Dummies Questions & Answers
Hi all,
Using the grep command, I want to find a text pattern in all directories and sub-directories.
e.g.: if I want to search for a pattern named "parameter", I used the command
grep -i "param" ../*
is this correct? (1 Reply)
Discussion started by: vinothrajan55
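Not quite: ../* only expands one directory level. grep's -r flag recurses on its own, as in this small sketch (paths and contents are invented):

```shell
dir=$(mktemp -d)
mkdir -p "$dir/sub"
echo 'a Parameter here' > "$dir/sub/f.txt"
echo 'nothing'          > "$dir/g.txt"

# -r recurses into every sub-directory, -i ignores case, -l lists file names.
hits=$(grep -ril 'param' "$dir")
echo "$hits"
rm -rf "$dir"
```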
7. Shell Programming and Scripting
these are numeric ids..
222932017099186177
222932014385467392
222932017371820032
222932017409556480
I have a text file with 300 million lines, as shown above. I want to find the duplicates in this file. Please suggest the quickest way.
sort | uniq -d will... (3 Replies)
Discussion started by: pamu
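Both usual approaches, side by side on a tiny invented sample (the trade-off, not the data, is the point):

```shell
big=$(mktemp)
printf '2\n1\n2\n3\n1\n' > "$big"

# Classic approach: uniq -d needs sorted input, prints each duplicated value once.
d1=$(sort "$big" | uniq -d)

# Single pass, no sort: print a value the second time it is seen.
# Memory grows with the number of distinct keys, not the file size.
d2=$(awk 'seen[$0]++ == 1' "$big")
rm -f "$big"
```

For a 300-million-line file, LC_ALL=C sort is typically much faster than a locale-aware sort, and sort spills to temporary files on disk, whereas the awk variant must hold every distinct key in memory.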
8. Shell Programming and Scripting
Hi All,
Input.txt
123,ABC,XYZ1,A01,IND,I68,IND,NN
123,ABC,XYZ1,A01,IND,I67,IND,NN
998,SGR,St,R834,scot,R834,scot,NN
985,SGR0399,St,R180,T15,R180,T1,YY
985,SGR0399,St,R180,T15,R180,T1,NN
985,SGR0399,St,R180,T15,R180,T1,NN
2943,SGR?99,St,R68,Scot,R77,Scot,YY... (2 Replies)
Discussion started by: unme
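If the goal is to deduplicate whole lines like the repeated 985 records above, the one-pass awk idioms below cover both directions (shown on a trimmed version of the sample):

```shell
in=$(mktemp)
cat > "$in" <<'EOF'
985,SGR0399,St,R180,T15,R180,T1,YY
985,SGR0399,St,R180,T15,R180,T1,NN
985,SGR0399,St,R180,T15,R180,T1,NN
EOF

uniqed=$(awk '!seen[$0]++' "$in")      # keep the first occurrence of each line
dups=$(awk 'seen[$0]++ == 1' "$in")    # print each duplicated line exactly once
rm -f "$in"
```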
9. Shell Programming and Scripting
with below given format,
I have been trying to find the IDs of all entries with duplicate names in the 2nd and 3rd columns, and their count, i.e. how many times each name was duplicated, if any:
0.237788 Aaban Aahva
0.291066 Aabheer Aahlaad
0.845814 Aabid Aahan
0.152208 Aadam... (6 Replies)
Discussion started by: busyboy
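The counting half of that can be sketched with an awk associative array keyed on the name pair (the third sample line here is invented to create a duplicate):

```shell
names=$(mktemp)
printf '0.237788 Aaban Aahva\n0.291066 Aabheer Aahlaad\n0.845814 Aaban Aahva\n' > "$names"

# Count each (2nd,3rd)-column name pair; report pairs seen more than once.
report=$(awk '{ cnt[$2 " " $3]++ }
              END { for (k in cnt) if (cnt[k] > 1) print cnt[k], k }' "$names")
echo "$report"
rm -f "$names"
```

Collecting the IDs as well would mean appending $1 to a second array under the same key.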
10. UNIX for Beginners Questions & Answers
Hello All,
This is a noob question. I tried searching for the answer, but the answers I found did not help me.
I have a file that can have duplicates.
100
200
300
400
100
150
the number 100 appears twice. I want to find the duplicate along with its line numbers.
expected... (4 Replies)
Discussion started by: vatigers
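A sketch of that duplicate-with-line-number report (on a trimmed copy of the sample; the output wording is made up):

```shell
nums=$(mktemp)
printf '100\n200\n100\n' > "$nums"

# Remember the line number of the first sighting of each value;
# report every later sighting together with both line numbers.
out=$(awk 'seen[$0] { print $0 ": lines " seen[$0] " and " NR; next }
           { seen[$0] = NR }' "$nums")
echo "$out"
rm -f "$nums"
```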
File::Find::Rule::VCS(3pm) User Contributed Perl Documentation File::Find::Rule::VCS(3pm)
NAME
File::Find::Rule::VCS - Exclude files/directories for Version Control Systems
SYNOPSIS
use File::Find::Rule ();
use File::Find::Rule::VCS ();
# Find all files smaller than 10k, ignoring version control files
my @files = File::Find::Rule->ignore_vcs
->file
->size('<10Ki')
->in( $dir );
DESCRIPTION
Many tools need to be equally useful both on ordinary files, and on code that has been checked out from revision control systems.
File::Find::Rule::VCS provides quick and convenient methods to exclude the version control directories of several major Version Control
Systems (currently CVS, subversion, and Bazaar).
File::Find::Rule::VCS implements methods to ignore the following:
CVS
Subversion
Bazaar
In addition, the following version control systems do not create directories in the checkout and do not require the use of any ignore
methods:
SVK
Git
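Outside Perl, roughly the same pruning can be sketched with find. This is a hedged equivalent, not part of the module: it handles only the directory names listed in the expression, and the paths are invented.

```shell
dir=$(mktemp -d)
mkdir -p "$dir/.svn" "$dir/src"
echo 'entries' > "$dir/.svn/entries"
echo 'code'    > "$dir/src/main.c"

# Prune common VCS directories, then print the remaining regular files.
files=$(find "$dir" \( -name .svn -o -name .git -o -name .bzr \
                       -o -name .hg -o -name CVS \) -prune \
             -o -type f -print)
echo "$files"
rm -rf "$dir"
```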
METHODS
ignore_vcs
# Ignore all common version control systems
$find->ignore_vcs;
# Ignore a specific named version control systems
$find->ignore_vcs($name);
# Ignore nothing (silent pass-through)
$find->ignore_vcs('');
The "ignore_vcs" method excludes the files for a named Version Control System from your File::Find::Rule search.
If passed, the name of the version control system is case-insensitive. Names currently supported are 'cvs', 'svn', 'subversion', 'bzr',
and 'bazaar'.
As a convenience for high-level APIs, if the VCS name is the defined null string '' then the call will be treated as a nullop.
If no params at all are passed, this method will ignore all supported version control systems. If ignoring every version control system,
please note that any legitimate directories called "CVS" or files starting with .# will be ignored, which is not always desirable.
In widely-distributed code, you instead should try to detect the specific version control system used and call ignore_vcs with the specific
name.
Passing "undef", or an unsupported name, will throw an exception.
ignore_cvs
The "ignore_cvs" method excludes all CVS directories from your File::Find::Rule search.
It will also exclude all the files left around by CVS after an automated merge that start with '.#' (dot-hash).
ignore_rcs
The "ignore_rcs" method excludes all RCS directories from your File::Find::Rule search.
It will also exclude all the files used by RCS to store the revisions (end with ',v').
ignore_svn
The "ignore_svn" method excludes all Subversion (".svn") directories from your File::Find::Rule search.
ignore_bzr
The "ignore_bzr" method excludes all Bazaar (".bzr") directories from your File::Find::Rule search.
ignore_git
The "ignore_git" method excludes all Git (".git") directories from your File::Find::Rule search.
ignore_hg
The "ignore_hg" method excludes all Mercurial/Hg (".hg") directories from your File::Find::Rule search.
TO DO
- Add support for other version control systems.
- Add other useful VCS-related methods
SUPPORT
Bugs should always be submitted via the CPAN bug tracker
<http://rt.cpan.org/NoAuth/ReportBug.html?Queue=File-Find-Rule-VCS>
For other issues, contact the maintainer
AUTHOR
Adam Kennedy <adamk@cpan.org>
SEE ALSO
<http://ali.as/>, File::Find::Rule
COPYRIGHT
Copyright 2005 - 2010 Adam Kennedy.
This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
The full text of the license can be found in the LICENSE file included with this module.
perl v5.10.1 2010-10-06 File::Find::Rule::VCS(3pm)