Post 302537329 by Bashingaway on Thursday 7th of July 2011 08:15:29 PM
Fastest way to delete duplicates from a large filelist.....

OK

I have two filelists......

The first is formatted like this....

/path/to/the/actual/file/location/filename.jpg

and has up to a million records

The second list shows filename.jpg where there is more than one instance.

and has maybe up to 65,000 records

I want to copy the files themselves (i.e. not retaining the full path) from the first filelist, but only when that filename does not appear in the second list.

At the moment I have a script that roughly does this....

Code:
# read the full-path list and the duplicate-name list into shell variables
FULLPATHFILENAME=`cat /FileWithPath.txt`
DUPLICATESLIST=`cat /DuplicateFiles.txt`

for REMOVEDUP in $FULLPATHFILENAME ; do

  ISITADUP=""

  # compare this record against every name in the duplicates list
  for THISDUP in $DUPLICATESLIST ; do

         echo "$REMOVEDUP" | grep -q "$THISDUP" && ISITADUP="yes"

  done

  # keep the record only when no duplicate name matched it
  [ -z "$ISITADUP" ] && echo "$REMOVEDUP" >> /ListWithoutDups.txt

done

As you can see, the script pulls up a record from the "with the path" filelist and greps it against each entry in the duplicate list; if the filename isn't found there, it outputs that filename with its path to /ListWithoutDups.txt. In the actual script it also does some copies and other actions on the file, as sketched below.
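For context, the copy part amounts to something like the loop below. This is only a simplified stand-in, and /SomeDestinationDir is just a placeholder for the real target directory:

Code:
# simplified stand-in for the copy step: copy each kept file into a single
# directory, dropping its original path (/SomeDestinationDir is a placeholder)
while read -r KEEPFILE ; do
    cp "$KEEPFILE" /SomeDestinationDir/
done < /ListWithoutDups.txt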

This is a pretty inefficient way of doing it IMHO, as it has to pull in each record individually and then check whether it's in the duplicates list (and that could mean 1m records * 65,000 duplicate checks).
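The only faster direction I have been able to think of is loading the duplicate names into an awk associative array and streaming the full-path list through it once, so each record costs a single lookup instead of a pass over the whole duplicate list. This is just an untested sketch, and it assumes the duplicate list holds bare filenames, one per line:

Code:
# first pass (NR==FNR): remember every duplicate filename in an array
# second pass: print a record only if its basename is not in that array
awk 'NR==FNR { dup[$0] = 1; next }
     { n = split($0, part, "/"); if (!(part[n] in dup)) print }' \
    /DuplicateFiles.txt /FileWithPath.txt > /ListWithoutDups.txt

But I don't know whether that is the right approach for lists this size, or whether something like grep -F -f would hold up better.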

Can anyone suggest a better/more efficient way to code this to achieve the same result?

Thanks

Last edited by Bashingaway; 07-08-2011 at 02:52 AM..
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

fastest way to remove duplicates.

I have searched the FAQ - by using sort, duplicates, etc.... but I didn't get any articles or results on it. Currently, I am using: sort -u file1 > file2 to remove duplicates. For a file size of approximately 1 gigabyte, the time taken to remove duplicates is 1hr 21 mins. Is there any other faster way... (15 Replies)
Discussion started by: radhika
15 Replies

2. UNIX for Dummies Questions & Answers

Fastest way to traverse through large directories

Hi! I have thousands of sub-directories, and hundreds of thousands of files in them. What is the fastest way to find out which files are older than a certain date? Is the "find" command the fastest? Or is there some other way? Right now I have a C script that traverses through and checks... (5 Replies)
Discussion started by: sreedharange
5 Replies

3. Shell Programming and Scripting

how to delete/remove directory in fastest way

Hello, I need help removing a directory. The directory is not empty; it contains several sub-directories and files. The total number of files in one directory is 12,24,446. rm -rf doesn't work; it prompts for every file. I want to delete without prompting and... (6 Replies)
Discussion started by: getdpg
6 Replies

4. Shell Programming and Scripting

An interactive way to delete duplicates

1) I am trying to write a script that works interactively: it lists duplicated records on a certain field/column and asks the user to delete one or more, and finally it deletes all the records the user has asked for. I have an idea to store those line numbers in an array, not sure how to do this in... (3 Replies)
Discussion started by: chvs2000
3 Replies

5. Shell Programming and Scripting

how can I delete duplicates in the log?

I have a log file and I am trying to run a script against it to search for key issues such as invalid users, errors, etc. In one part, I grep for session closed and get a lot of the same thing, i.e. root username etc. I want to remove the multiple root and just have it do a count, like wc -l ... (5 Replies)
Discussion started by: taekwondo
5 Replies

6. Shell Programming and Scripting

Fastest way to delete line

I have a 5 GB text file (log/debug). I want to delete all lines containing 'TRACE'. Command used: sed -i '/TRACE/d' mylog.txt Is there any faster way to do this? (1 Reply)
Discussion started by: johnbach
1 Replies

7. Shell Programming and Scripting

Delete duplicates via script?

Hello, I have the following problem: there are two folders with a lot of files. Example: FolderA contains AAA, BBB, CCC; FolderB contains DDD, EEE, AAA. How can I, via script, identify AAA as a duplicate in Folder B and delete it there, so that only DDD and EEE remain in Folder B? Thank you... (16 Replies)
Discussion started by: Y-T
16 Replies

8. Shell Programming and Scripting

Delete duplicates in CA bundle

I have a big CA bundle certificate file, and each time I get a request to add a new certificate to the existing bundle I need to make sure it is not already present. How can I validate the duplicates? The alignment of the certificates within the bundle seems to be different. Example: Cert 1... (7 Replies)
Discussion started by: diva_thilak
7 Replies

9. Shell Programming and Scripting

Delete only if duplicates found in each record

Hi, I have another problem that I have been trying to solve by myself but failed. inputfile ;; ID T08578 NAME T08578 SBASE 30696 EBASE 32083 TYPE P func just test func chronology func cholesterol func null INT 30765-37333 INT 37154-37318 Link 5546 Link 8142 (4 Replies)
Discussion started by: redse171
4 Replies

10. Shell Programming and Scripting

To Delete the duplicates using Part of File Name

I am using the below script to delete duplicate files, but it is not working for directories with more than 10k files: "Argument is too long" is the error from ls -t. I tried to replace ls -t with find . -type f \( -iname "*.xml" \) -printf '%T@ %p\n' | sort -rg | sed -r 's/* //' | awk... (8 Replies)
Discussion started by: gold2k8
8 Replies