Yes. You can even use the -S (size) option to indicate that there was no confusion.
--- Post updated at 15:05 ---
The diff utility should be silent with such files.
Since my directories contain binaries, and diff only works with text files, it would not help me.
--- Post updated at 10:47 AM ---
Quote:
Originally Posted by RudiC
It doesn't REALLY seem so, does it? Just from looking at it, and trying to apply some logics and common sense, I'd say SDB1_Maxtor_Drive, MAXTOR_SDB1, and NEVER_DELETE_THIS_DIRECTORY are missing in /media/andy/MAXTOR_SDB1/Ubuntu_Mate_18.04/, and 2019-02-23_23:42, and 2019-02-25_03:47 are missing in /media/andy/MAXTOR_SDB1/Linux_Files/.
The other filenames are paired and seem to exist in either dir but cannot be considered equal based on the info known thus far.
The reason for my post is this.
I use Clonezilla to make images of my main drive to a 2nd older drive.
I also make images of my 2nd drive to my main drive.
That uses a lot more space than simply copying files to my main drive.
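For what it's worth, diff can compare whole directory trees containing binaries without dumping their contents; a minimal sketch, using the paths from the post above:

```shell
# -r recurses into sub-directories; -q only reports *whether* files differ,
# so a binary file produces a one-line "Files ... differ" message, not a dump.
diff -rq /media/andy/MAXTOR_SDB1/Ubuntu_Mate_18.04 /media/andy/MAXTOR_SDB1/Linux_Files
```

Files present in only one tree are reported as "Only in ...", which also answers the missing-files question above.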
I want to find duplicates in a file based on the second field. I wrote this code:
nawk '{a++} END{for i in a {if (a>1) print}}' temp
I could not find what's wrong with it.
I'd appreciate any help. (5 Replies)
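For reference, the one-liner above never indexes the array by the second field (a++ uses no key, and the for/if syntax needs parentheses). A corrected sketch, assuming whitespace-separated input in a file named temp (awk and nawk accept the same syntax here):

```shell
# Hypothetical sample input: field 2 value "a" appears twice.
printf 'x a\ny b\nz a\n' > temp

# a[$2]++ keys the array by field 2; at END, print each value seen more than once.
awk '{a[$2]++} END {for (i in a) if (a[i] > 1) print i}' temp
# prints: a
```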
I have a folder which in turn has numerous sub-folders, all containing PDF files where the same file is named in different ways.
So I need a script, if one can be written, to find and print the duplicate files (that is, files with the same size) along with their respective paths.
So I assume here that same file... (5 Replies)
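A sketch of the size-based approach described above, assuming GNU find (for -printf) and taking equal size as the duplicate criterion from the post; comparing checksums (e.g. md5sum) would be a safer test than size alone:

```shell
# Emit "size path" for every PDF, then report every size that occurs more
# than once together with the paths that share it.
find . -type f -name '*.pdf' -printf '%s %p\n' |
    awk '{count[$1]++; paths[$1] = paths[$1] $0 "\n"}
         END {for (s in count) if (count[s] > 1) printf "%s", paths[s]}'
```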
Hi All,
I am an Oracle Apps Tech guy, and I have a requirement to find whether 777 permissions are set on all folders and sub-folders
under APPL_TOP (a folder/directory), with the conditions below:
i) the directory names should start with xx..... (like xxau, xxcfi, xxcca, etc.)
and exclude the directory... (11 Replies)
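The check described above can be sketched with find; APPL_TOP is assumed here to be an environment variable pointing at the directory in question, and -perm 777 (no leading dash) matches the exact mode:

```shell
# List directories named xx* under $APPL_TOP whose permissions are exactly 777.
find "$APPL_TOP" -type d -name 'xx*' -perm 777 -print
```

Excluded directories can be pruned with additional `! -name '...'` tests before -print.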
Hello,
My text file has input of the form
abc dft45.xml
ert rt653.xml
abc ert57.xml
I need to write a Perl or shell script to find duplicates in the first column and write them to a text file of the form...
abc dft45.xml
abc ert57.xml
Can someone help me, please? (5 Replies)
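The request above can be met with a two-pass awk sketch, no Perl needed; input.txt and dups.txt are hypothetical file names:

```shell
# First pass (NR == FNR) counts column-1 keys; second pass prints every line
# whose key occurs more than once, preserving the original order.
awk 'NR == FNR {count[$1]++; next} count[$1] > 1' input.txt input.txt > dups.txt
```

For the sample data, dups.txt ends up holding the two "abc" lines.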
example data
5666700842511TAfmoham03151008075205999900000001000001000++
5666700843130MAfmoham03151008142606056667008390315100005001
6666666663130MAfmoham03151008142606056667008390315100005001
I'd like to sort on positions 10-14, keeping lines where those characters equal "130MA".
Then based on positions... (0 Replies)
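A sketch for the fixed-width selection described above. One assumption to flag: in the sample data the "130MA" key actually sits in 1-based columns 11-15 (the post appears to count from zero), hence substr($0, 11, 5) below; data.txt is a hypothetical file name:

```shell
# Keep only lines whose fixed-width key equals "130MA", then sort on that key
# (character positions 11-15 of the line).
awk 'substr($0, 11, 5) == "130MA"' data.txt | sort -k1.11,1.15
```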
Hi all,
Using the grep command, I want to find a text pattern in all directories and sub-directories.
e.g.: if I want to search for a pattern named "parameter", I used the command
grep -i "param" ../*
is this correct? (1 Reply)
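Regarding the attempt above: `../*` only expands to the parent directory's immediate entries, so it does not recurse. Two sketches, one using GNU grep's -r flag and one portable form via find:

```shell
# GNU grep: search all files under the current directory, case-insensitively,
# listing only the names of matching files (-l).
grep -ril "param" .

# Portable alternative: let find walk the tree and hand the files to grep.
find . -type f -exec grep -il "param" {} +
```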
these are numeric ids..
222932017099186177
222932014385467392
222932017371820032
222932017409556480
I have a text file with 300 million lines, as shown above. I want to find the duplicates in this file. Please suggest the quickest way.
sort | uniq -d will... (3 Replies)
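Two common routes, sketched below: a one-pass awk, which is fast but holds every distinct ID in memory (potentially prohibitive at 300 million lines), versus the external sort already mentioned, which spills to temporary files on disk so memory stays bounded; ids.txt is a hypothetical file name:

```shell
# Print each ID the second time it is seen, i.e. once per duplicated ID.
awk 'seen[$0]++ == 1' ids.txt

# Disk-backed alternative: identical output (duplicated IDs, once each), lower memory.
sort ids.txt | uniq -d
```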
With the format given below,
I have been trying to find the IDs of all entries with duplicate names in the 2nd and 3rd columns, along with a count of how many times each duplication happened for any name, if any:
0.237788 Aaban Aahva
0.291066 Aabheer Aahlaad
0.845814 Aabid Aahan
0.152208 Aadam... (6 Replies)
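One way to sketch the request above: count every name seen in column 2 or column 3, collect the column-1 IDs alongside, and report only names that occurred more than once (names.txt is a hypothetical file name):

```shell
# cnt[] tracks how often each name appears across columns 2 and 3;
# ids[] accumulates the column-1 IDs that used the name.
awk '{cnt[$2]++; ids[$2] = ids[$2] " " $1
      cnt[$3]++; ids[$3] = ids[$3] " " $1}
     END {for (n in cnt) if (cnt[n] > 1) print n, cnt[n] ids[n]}' names.txt
```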
Hello All,
This is a noob question. I tried searching for the answer, but the answers I found did not help me.
I have a file that can have duplicates.
100
200
300
400
100
150
The number 100 appears twice. I want to find the duplicates along with their line numbers.
expected... (4 Replies)
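The output described above can be produced with a two-pass awk sketch; file is the hypothetical input name:

```shell
# Pass 1 (NR == FNR) counts each value; pass 2 prints the line number (FNR)
# and the value for every line whose value occurs more than once.
awk 'NR == FNR {count[$0]++; next} count[$0] > 1 {print FNR, $0}' file file
```

For the sample data this prints "1 100" and "5 100".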