Full Discussion: Find duplicate files
Post 302434335 by Greco on Friday 2nd of July 2010 07:22:20 AM
I use the rmdupe script (from IgnorantGuru's Blog) to find and remove duplicate files on a Linux system, and Clone Remover from Ashisoft to find duplicate files on Windows.
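For readers who prefer to stay with standard tools, the same idea can be sketched as a checksum pipeline: hash every file and report hashes that occur more than once. This assumes GNU md5sum, sort and uniq (the -w and --all-repeated options are GNU extensions):

    # List groups of files with identical content under the current directory.
    # uniq -w32 compares only the 32-character MD5 column; --all-repeated=separate
    # prints each group of duplicates separated by a blank line.
    find . -type f -exec md5sum {} + | sort | uniq -w32 --all-repeated=separate

Review the groups before deleting anything; a hash collision is unlikely, but a final byte-for-byte cmp of the candidates is cheap insurance.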
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to find duplicate files with find?

Hello all, I would like to search through a set of files, and the result needs to be the files that are duplicated. (8 Replies)
Discussion started by: umen
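If installing a small utility is acceptable, fdupes answers this question directly; the path below is only a placeholder:

    # Recursively list sets of duplicate files; -S also prints the size of each set.
    fdupes -rS /path/to/search

Otherwise the md5sum pipeline shown in the reply above gives the same listing with stock tools.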

2. Shell Programming and Scripting

Find duplicate value comparing 2 files and create an output

I need a Perl script which will create an output file after comparing two different files in a directory path: /export/home/abc/file1 /export/home/abc/file2 File Format: <IP><TAB><DeviceName><TAB><DESCRIPTIONS> file1: 10.1.2.1.3<tab>abc123def<tab>xyz.mm1.ppp.... (2 Replies)
Discussion started by: ricky007
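A single awk pass is often enough for this kind of comparison, so Perl may not be needed. The sketch below assumes tab-separated fields and that a duplicate means the same IP (field 1) appearing in both files; the output file name is also an assumption:

    # Print rows from file2 whose IP (first field) also appears in file1.
    awk -F'\t' 'NR==FNR { seen[$1] = 1; next } $1 in seen' \
        /export/home/abc/file1 /export/home/abc/file2 > duplicate_ips.out

Change the key to $1 FS $2 if the device name must match as well.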

3. Shell Programming and Scripting

Find Duplicate files, not by name

I have a directory with images: -rw-r--r-- 1 root root 26216 Mar 19 21:00 020109.210001.jpg -rw-r--r-- 1 root root 21760 Mar 19 21:15 020109.211502.jpg -rw-r--r-- 1 root root 23144 Mar 19 21:30 020109.213002.jpg -rw-r--r-- 1 root root 31350 Mar 20 00:45 020109.004501.jpg -rw-r--r-- 1 root... (2 Replies)
Discussion started by: Ikon
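Because the names differ, hashing the contents is the usual approach. A sketch that groups the images by MD5 and prints only the checksums shared by more than one file (assumes md5sum is available and the filenames contain no spaces):

    # Group files by content hash and print the hashes that occur more than once.
    md5sum *.jpg | sort | awk '{ count[$1]++; names[$1] = names[$1] "\n  " $2 }
        END { for (h in count) if (count[h] > 1) print h names[h] }'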

4. Shell Programming and Scripting

Find duplicate files by file size

Hi! I want to find duplicate files (criteria: file size) in my download folder. I tried it like this: find /Users/frodo/Downloads \! -type d -exec du {} \; | sort > /Users/frodo/Desktop/duplicates_1.txt; cut -f 1 /Users/frodo/Desktop/duplicates_1.txt | uniq -d | grep -hif -... (9 Replies)
Discussion started by: Dirk Einecke
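One pitfall in the quoted attempt is that du reports disk usage rounded to blocks, not the exact byte size, so unrelated files can collide and identical files can appear to differ. A sketch that uses exact sizes instead, assuming GNU find (BSD/macOS find lacks -printf):

    # Print every size (in bytes) that occurs more than once, with the matching files.
    find /Users/frodo/Downloads -type f -printf '%s\t%p\n' | sort -n |
        awk -F'\t' '{ n[$1]++; f[$1] = f[$1] "\n  " $2 }
            END { for (s in n) if (n[s] > 1) print s " bytes:" f[s] }'

Equal size only marks a candidate duplicate; a checksum or cmp pass on each size group confirms it.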

5. Shell Programming and Scripting

find duplicate string in many different files

I have more than 100 files like this: SVEAVLTGPYGYT 2 SVEGNFEETQY 10 SVELGQGYEQY 28 SVERTGTGYT 6 SVGLADYNEQF 21 SVGQGYEQY 32 SVKTVLGYEQF 2 SVNNEQF 12 SVRDGLTNSPLH 3 SVRRDREGLEQF 11 SVRTSGSYEQY 17 SVSVSGSPLQETQY 78 SVVHSTSPEAF 59 SVVPGNGYT 75 (4 Replies)
Discussion started by: xshang
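Taking the first column as the string of interest, a short pipeline can report which strings occur in more than one of the 100-plus files. The file glob below is an assumption:

    # Pair each string (column 1) with the file it came from, drop repeats within a
    # file, then keep only the strings that remain in more than one file.
    awk '{ print FILENAME, $1 }' file* | sort -u | awk '{ print $2 }' | sort | uniq -d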

6. Shell Programming and Scripting

Find duplicate files but with different extensions

Hi! I wonder if anyone can help with this: I have a directory, /xyz, that has the following files: chsLog.107.20130603.gz chsLog.115.20130603 chsLog.111.20130603.gz chsLog.107.20130603 chsLog.115.20130603.gz As you can see there are two files that are the same but only with a minor... (10 Replies)
Discussion started by: fretagi
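When the only difference is a trailing .gz, the compressed file can be compared against its uncompressed twin with zcmp (part of the gzip package) rather than trusting the names alone. A sketch; the directory and the decision to merely report rather than delete are assumptions:

    cd /xyz || exit 1
    for gz in *.gz; do
        plain=${gz%.gz}                 # chsLog.107.20130603.gz -> chsLog.107.20130603
        if [ -f "$plain" ] && zcmp -s "$gz" "$plain"; then
            echo "identical pair: $gz and $plain"   # swap echo for rm -v once verified
        fi
    done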

7. Shell Programming and Scripting

Find duplicate rows between files

Hi champs, I have a requirement where I need to compare two files line by line and ignore duplicates. Note, I have the files in sorted order. I have tried using the comm command, but it's not working for my scenario. Input file1 srv1..development..employee..empname,empid,empdesg... (1 Reply)
Discussion started by: Selva_2507
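Since the files are already sorted, comm should do the splitting, but it is picky about collation: both files must be sorted under the same locale (LC_ALL=C sort is the safe choice), which is the usual reason it appears not to work. A sketch with assumed file names:

    comm -12 file1 file2 > common_rows        # lines present in both files
    comm -23 file1 file2 > only_in_file1      # lines unique to file1
    # Order-independent alternative: lines of file2 that never appear in file1.
    grep -Fxvf file1 file2 > only_in_file2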

8. Shell Programming and Scripting

Find help in shell - that clears away duplicate files

I am so frustrated!!! I want a nice command that clears away duplicate files: find . -type f -regex '.*{1,3}\..*' | xargs -I## rm -v '##' should work in my opinion. But it finds nothing even though I have files with names like: Scooby-Doo-1.txt Himalaya-2.jpg Camping... (8 Replies)
Discussion started by: Mr.Glaurung
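The likely reason the command finds nothing is that GNU find's -regex defaults to Emacs regular expressions, where a bare {1,3} is not treated as a repetition, and the expression must match the whole path, not just the file name. A sketch that switches to POSIX extended syntax; the digit class is an assumption about what the original pattern intended:

    # Match paths like ./Scooby-Doo-1.txt or ./Himalaya-22.jpg (name, dash, 1-3 digits, extension).
    find . -type f -regextype posix-extended -regex '.*-[0-9]{1,3}\.[^/]*' -print
    # When the listing looks right, replace -print with -exec rm -v {} + (or -delete).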

9. Shell Programming and Scripting

Find duplicate files, keeping the latest, in Linux

I have tried the following code, but with it I couldn't achieve what I want. #!/usr/bin/bash find ./ -type f \( -iname "*.xml" \) | sort -n > fileList sed -i '/\.\/fileList/d' fileList NAMEOFTHISFILE=$(echo $0|sed -e 's/\/()$*.^|/\\&/g') sed -i "/$NAMEOFTHISFILE/d"... (2 Replies)
Discussion started by: gold2k8
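One way to read "keep the latest" is to walk the .xml files newest-first and flag any file whose checksum has already been seen. A sketch, assuming bash 4 associative arrays and GNU find's -printf:

    #!/bin/bash
    # The newest file with a given MD5 is kept; later (older) copies are only reported here.
    declare -A kept
    while IFS= read -r file; do
        sum=$(md5sum "$file" | awk '{ print $1 }')
        if [[ -n ${kept[$sum]} ]]; then
            echo "older duplicate of ${kept[$sum]}: $file"   # change echo to rm -v after checking
        else
            kept[$sum]=$file
        fi
    done < <(find . -type f -iname '*.xml' -printf '%T@ %p\n' | sort -rn | cut -d' ' -f2-)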

10. UNIX for Advanced & Expert Users

AIX find duplicate backup files

I would like to find and delete old backup files in AIX. How would I go about doing this? For example: server1_1-20-2020 server1_1-21-2020 server1_1-22-2020 server1_1-23-2020 server2_1-20-2020 server2_1-21-2020 server2_1-22-2020 server2_1-23-2020 How would I go about finding and... (3 Replies)
Discussion started by: cokedude
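If "old" simply means older than a certain number of days, AIX find can do this on its own with -mtime; the directory and the 30-day cutoff below are assumptions:

    # List server backup files not modified in the last 30 days.
    find /backups -type f -name 'server*_*-*-*' -mtime +30 -exec ls -l {} \;
    # Once the listing is confirmed, swap ls -l for rm to delete them.

Keeping, say, the newest three backups per server regardless of age needs a little more scripting (sort each server's files by date and skip the first few).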