02-25-2019
To see which files would be deleted
fdupes -f
This displays each set of duplicates without the first file in the set, i.e. exactly the files that would be removed.
Then, to actually delete the files in that list without prompting, use
fdupes -Nd
But in your case I would use the interactive mode (-d without -N), at least until you are acquainted with the subtleties of this tool.
Good luck
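Putting the steps above together, a cautious workflow might look like this (a sketch only; fdupes must be installed, and ~/photos is a hypothetical target directory):

```shell
# 1. Preview: list every duplicate except the first file in each set
#    (these are precisely the files a later -Nd run would delete).
fdupes -rf ~/photos

# 2. Interactive mode: fdupes asks, set by set, which file to keep.
fdupes -rd ~/photos

# 3. Non-interactive: keep the first file in each set, delete the rest.
#    Only use this once the preview in step 1 looks right.
fdupes -rNd ~/photos
```

The -r flag recurses into subdirectories; drop it to stay in the top-level directory only.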
--- Post updated at 15:28 ---
Quote:
Originally Posted by
nezabudka
with such files, the diff utility should be silent
That is not guaranteed when comparing binary files. The fdupes utility handles binary files correctly, whereas diff is a text-oriented tool.
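For a reliable byte-level comparison of two binary files, cmp(1) is the appropriate tool; a minimal sketch (file names are hypothetical):

```shell
# Exit status 0 means the files are byte-for-byte identical,
# 1 means they differ; -s suppresses cmp's own output.
cmp -s file_a file_b && echo "identical" || echo "different"
```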
FDUPES(1) General Commands Manual FDUPES(1)
NAME
fdupes - finds duplicate files in a given set of directories
SYNOPSIS
fdupes [ options ] DIRECTORY ...
DESCRIPTION
Searches the given path for duplicate files. Such files are found by comparing file sizes and MD5 signatures, followed by a byte-by-byte
comparison.
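The size-then-checksum strategy described above can be roughly approximated with standard tools; a sketch (GNU md5sum and uniq assumed, directory name hypothetical; unlike fdupes, this does no final byte-by-byte confirmation):

```shell
# Checksum every regular file, sort by checksum, and print only
# lines whose first 32 characters (the MD5 hex digest) repeat --
# i.e. the candidate duplicate sets.
find ./some_dir -type f -exec md5sum {} + | sort | uniq -w32 -D
```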
OPTIONS
-r --recurse
include files residing in subdirectories
-s --symlinks
follow symlinked directories
-H --hardlinks
normally, when two or more files point to the same disk area they are treated as non-duplicates; this option will change this behavior
-n --noempty
exclude zero-length files from consideration
-f --omitfirst
omit the first file in each set of matches
-1 --sameline
list each set of matches on a single line
-S --size
show size of duplicate files
-q --quiet
hide progress indicator
-d --delete
prompt user for files to preserve, deleting all others (see CAVEATS below)
-v --version
display fdupes version
-h --help
displays help
SEE ALSO
md5sum(1)
NOTES
Unless -1 or --sameline is specified, duplicate files are listed together in groups, each file displayed on a separate line. The groups are
then separated from each other by blank lines.
When -1 or --sameline is specified, spaces and backslash characters (\) appearing in a filename are preceded by a backslash character.
CAVEATS
If fdupes returns with an error message such as fdupes: error invoking md5sum it means the program has been compiled to use an external
program to calculate MD5 signatures (otherwise, fdupes uses internal routines for this purpose), and an error has occurred while attempting
to execute it. If this is the case, the specified program should be properly installed prior to running fdupes.
When using -d or --delete, care should be taken to ensure against accidental data loss.
When used together with options -s or --symlink, a user could accidentally preserve a symlink while deleting the file it points to.
Furthermore, when specifying a particular directory more than once, all files within that directory will be listed as their own duplicates,
leading to data loss should a user preserve a file without its "duplicate" (the file itself!).
AUTHOR
Adrian Lopez <adrian2@caribe.net>
FDUPES(1)