diff is a utility that compares text files: you give it two text files and it tells you the differences between them. Until now I didn't know that the GNU version can compare directories too, but obviously it can. I have learned something new today.
To understand how diff works, let us suppose for the moment that it works on lines only (it doesn't). In principle there are three possibilities:
1) a line is present in both files
2) a line is present in file 1 (only) but not in file 2
3) a line is present in file 2 (only) but not in file 1
This is the situation you have here. Your output means the two directories will contain the same files once you:
1) copy Briggs_Stratton_Generator.zip from /media/andy/MAXTOR_SDB1/Linux_Files to /media/andy/MAXTOR_SDB1/Ubuntu_Mate_18.04
2) copy Brinkmann_8109415-W.zip from /media/andy/MAXTOR_SDB1/Ubuntu_Mate_18.04 to /media/andy/MAXTOR_SDB1/Linux_Files
3) copy Brother_2240_Drivers.zip also from /media/andy/MAXTOR_SDB1/Linux_Files to /media/andy/MAXTOR_SDB1/Ubuntu_Mate_18.04
Notice, though, that two files with the same name (all the others, not mentioned in the output) are not necessarily identical: they could still differ in content. As a first step you would have to compare file sizes too, and even if the sizes are equal the content might still differ. You would have to use diff again (this time on the two individual files) to find out.
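For reference, the whole procedure can be sketched like this. The temp directories below are hypothetical stand-ins for the real /media/andy/MAXTOR_SDB1/... paths from the question:

```shell
# Hypothetical directories standing in for the real paths in the question.
rm -rf /tmp/dir_a /tmp/dir_b
mkdir -p /tmp/dir_a /tmp/dir_b
printf 'version 1\n' > /tmp/dir_a/common.txt      # same name in both dirs...
printf 'version 2\n' > /tmp/dir_b/common.txt      # ...but different content
printf 'x\n'         > /tmp/dir_a/only_in_a.txt   # present on one side only

# -q lists which files differ or are missing; -r recurses into subdirectories.
# diff exits with status 1 when differences exist, hence the trailing "|| true".
diff -rq /tmp/dir_a /tmp/dir_b || true

# Then compare a same-named pair line by line:
diff /tmp/dir_a/common.txt /tmp/dir_b/common.txt || true
```

The `-q` report names same-sized, same-named files that differ in content, which saves you the manual size comparison described above.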
I want to find duplicates in a file based on the 2nd field. I wrote this code:
nawk '{a[$2]++} END {for (i in a) if (a[i] > 1) print i}' temp
(My original attempt was nawk '{a++} END{for i in a {if (a>1) print}}' and I could not find what's wrong with it: the array is never keyed on $2, awk's for-in loop needs parentheses, and the test must index the array as a[i] > 1.)
Appreciate help (5 Replies)
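The corrected one-liner can be demonstrated on throwaway sample data; awk is used here in place of nawk, which may not exist on every system:

```shell
# Hypothetical sample where duplicates should be detected on field 2.
printf '%s\n' 'a1 foo' 'a2 bar' 'a3 foo' 'a4 baz' > /tmp/temp

# Key the array on $2; after reading everything, print each field-2 value
# that occurred more than once.
awk '{a[$2]++} END {for (i in a) if (a[i] > 1) print i}' /tmp/temp
# -> foo
```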
I have a folder which in turn has numerous sub folders, all containing PDF files where the same file is named in different ways.
So I need a script, if one can be written, to find and print the duplicate files (that is, files with the same size) along with their respective paths.
So I assume here that same file... (5 Replies)
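One way to sketch this, assuming GNU find (its -printf option is not POSIX); the directory layout below is made up for illustration:

```shell
# Hypothetical tree of subfolders holding same-sized files.
rm -rf /tmp/pdfroot
mkdir -p /tmp/pdfroot/a /tmp/pdfroot/b
printf '12345678\n' > /tmp/pdfroot/a/manual.pdf    # 9 bytes
printf 'abcdefgh\n' > /tmp/pdfroot/b/guide.pdf     # 9 bytes -> same size
printf 'xy\n'       > /tmp/pdfroot/a/other.pdf     # 3 bytes -> unique size

# Emit "size<TAB>path", group by size, report sizes seen more than once.
find /tmp/pdfroot -type f -printf '%s\t%p\n' |
awk -F'\t' '{n[$1]++; p[$1] = p[$1] " " $2}
            END {for (s in n) if (n[s] > 1) print "size " s ":" p[s]}'
```

Same size is only a cheap first filter; a checksum pass (e.g. md5sum or cksum) over the candidates would confirm real duplicates.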
Hi All,
I am an Oracle Apps Tech guy, and I have a requirement to check whether 777 permissions are set on all folders and sub-folders
under APPL_TOP (a folder/directory) with the conditions below:
i) the directory names should start with xx..... (like xxau, xxcfi, xxcca, ...etc)
and exclude the directory... (11 Replies)
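A sketch of the core check; APPL_TOP and the directory names here are made-up stand-ins, and the "exclude" condition from the truncated question is left out:

```shell
# Hypothetical APPL_TOP layout for illustration.
rm -rf /tmp/APPL_TOP
mkdir -p /tmp/APPL_TOP/xxau /tmp/APPL_TOP/xxcfi /tmp/APPL_TOP/other
chmod 777 /tmp/APPL_TOP/xxau      # matches: name xx*, mode 777
chmod 755 /tmp/APPL_TOP/xxcfi     # wrong permissions
chmod 777 /tmp/APPL_TOP/other     # wrong name

# -perm 777 matches the exact mode; use -perm -777 for "at least these bits".
find /tmp/APPL_TOP -type d -name 'xx*' -perm 777
```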
Hello,
My text file has input of the form
abc dft45.xml
ert rt653.xml
abc ert57.xml
I need to write a Perl script/shell script to find duplicates in the first column and write them into a text file of the form...
abc dft45.xml
abc ert57.xml
Can someone help me, please? (5 Replies)
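A two-pass awk sketch works here: reading the file twice keeps the output in the original order. The file name is hypothetical:

```shell
# The sample input from the question.
printf '%s\n' 'abc dft45.xml' 'ert rt653.xml' 'abc ert57.xml' > /tmp/in.txt

# Pass 1 (NR == FNR) counts first-column keys; pass 2 prints every line
# whose key occurred more than once overall.
awk 'NR == FNR {c[$1]++; next} c[$1] > 1' /tmp/in.txt /tmp/in.txt > /tmp/dups.txt
cat /tmp/dups.txt
# -> abc dft45.xml
#    abc ert57.xml
```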
example data
5666700842511TAfmoham03151008075205999900000001000001000++
5666700843130MAfmoham03151008142606056667008390315100005001
6666666663130MAfmoham03151008142606056667008390315100005001
I'd like to sort on position 10-14 where the characters are eq "130MA".
Then based on positions... (0 Replies)
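A sketch using awk's substr; note that in the sample rows shown, the string 130MA actually sits at columns 11-15 (substr offset 11, length 5), not 10-14. The later sort range is chosen arbitrarily since the question is truncated:

```shell
printf '%s\n' \
  '5666700842511TAfmoham03151008075205999900000001000001000++' \
  '5666700843130MAfmoham03151008142606056667008390315100005001' \
  '6666666663130MAfmoham03151008142606056667008390315100005001' > /tmp/fixed.txt

# Keep only rows whose columns 11-15 are "130MA", then sort on another
# fixed-width character range (here 16-22, purely for illustration).
awk 'substr($0, 11, 5) == "130MA"' /tmp/fixed.txt | sort -k1.16,1.22
```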
Hi all,
Using the grep command, I want to find a pattern of text in all directories and sub-directories.
e.g.: if I want to search for a pattern named "parameter", I used the command
grep -i "param" ../*
is this correct? (1 Reply)
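Not quite: ../* only expands to the entries directly inside the parent directory, and without -r grep skips directories entirely. A recursive search looks like this (the tree below is made up):

```shell
# Hypothetical tree with the pattern buried in a subdirectory.
rm -rf /tmp/tree
mkdir -p /tmp/tree/sub/deeper
printf 'some PARAMETER here\n' > /tmp/tree/sub/deeper/conf.txt
printf 'nothing relevant\n'    > /tmp/tree/readme.txt

# -r recurses, -i ignores case, -l prints only the matching file names.
grep -ril 'parameter' /tmp/tree
```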
these are numeric ids..
222932017099186177
222932014385467392
222932017371820032
222932017409556480
I have a text file with 300 million lines, as shown above. I want to find the duplicates in this file. Please suggest the quickest way.
sort | uniq -d will... (3 Replies)
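Two common sketches, shown on a tiny stand-in for the 300-million-line file. sort spills to disk, so it scales; the awk one-liner is a single pass but keeps every distinct line in memory, which may not fit for 300M ids:

```shell
# Tiny hypothetical sample with one duplicated id.
printf '%s\n' 222932017099186177 222932014385467392 \
              222932017099186177 222932017409556480 > /tmp/ids.txt

# Disk-based: LC_ALL=C forces byte-wise comparison, which is much faster.
LC_ALL=C sort /tmp/ids.txt | uniq -d

# In-memory single pass: prints each duplicated line once, at its first repeat.
awk 'seen[$0]++ == 1' /tmp/ids.txt
```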
With the format given below,
I have been trying to find all IDs for entries with duplicate names in the 2nd and 3rd columns, and their count, i.e. how many times each name is duplicated, if any:
0.237788 Aaban Aahva
0.291066 Aabheer Aahlaad
0.845814 Aabid Aahan
0.152208 Aadam... (6 Replies)
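One sketch, treating the first column as the "ID" and checking both name columns. The three rows below are made up so that duplicates actually occur:

```shell
# Hypothetical rows in the question's format: id name1 name2.
printf '%s\n' '0.237788 Aaban Aahva' \
              '0.291066 Aaban Aahlaad' \
              '0.845814 Aabid Aahva' > /tmp/names.txt

# Count every name seen in columns 2 and 3, remembering which IDs carried it;
# at the end, report each name that occurred more than once.
awk '{ for (f = 2; f <= 3; f++) { cnt[$f]++; ids[$f] = ids[$f] " " $1 } }
     END { for (n in cnt) if (cnt[n] > 1) print n, cnt[n] ":" ids[n] }' /tmp/names.txt
```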
Hello All,
This is a noob question. I tried searching for the answer, but the answer I found did not help me.
I have a file that can have duplicates.
100
200
300
400
100
150
The number 100 appears twice. I want to find the duplicates along with their line numbers.
expected... (4 Replies)
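A sketch that records line numbers as it counts (the file name is hypothetical):

```shell
# The sample numbers from the question.
printf '%s\n' 100 200 300 400 100 150 > /tmp/nums.txt

# Append each line number to a per-value list; at the end, report every
# value that was seen more than once together with its line numbers.
awk '{ cnt[$0]++; ln[$0] = ln[$0] (ln[$0] ? "," : "") NR }
     END { for (v in cnt) if (cnt[v] > 1) print v " -> lines " ln[v] }' /tmp/nums.txt
# -> 100 -> lines 1,5
```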