Delete large number of files


 
# 1  
Old 04-20-2011
Delete large number of files

Hi. I need to delete a large number of files listed in a txt file. There are over 90,000 files in the list. Some of the directory names and file names have spaces in them.

In the file, each line is a full path to a file:
/path/to/the files/file1
/path/to/some other/files/file 2

I have a couple of questions.

1. To remedy the spaces in the directory and file names, I added a " to the beginning and end of each line.
The file now looks like this:
"/path/to/the files/file1"
"/path/to/some other/files/file 2"

Would this solve the problem of the spaces in the file and directory names? Would single quotes be preferable?
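(For what it's worth, a quick test like the one below shows that the default xargs parser does treat a double-quoted line as a single argument; the /tmp paths and list name here are made up purely for illustration, and the approach breaks down if a filename itself contains a quote character.)
Code:
# hypothetical demo: a path with spaces, quoted in the list file
mkdir -p "/tmp/quote test" && touch "/tmp/quote test/file 1"
printf '"%s"\n' "/tmp/quote test/file 1" > /tmp/list.txt
# default xargs honours the double quotes, so the path arrives as one argument
xargs ls -l < /tmp/list.txt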

2.
Code:
xargs rm -f < listoffiles.txt

I am familiar with xargs. Would the number of files (90,000+) cause an issue when using it?

If so, would a do loop of some kind overcome that? I saw the code below posted in another thread here. Could it be executed from the shell directly, or would it need to be wrapped in a script?
Code:
while read file; do rm "$file"; done < listoffiles.txt

Thanks in advance for any suggestions.
# 2  
Old 04-20-2011
xargs -d '\n' rm -f < hugelistofiles ought to tell it to split on newlines alone without resorting to editing the input file.

Huge numbers of arguments shouldn't be a problem for xargs; that is exactly what it was made for. It knows the system's maximum argument size and will split the list across multiple rm -f calls accordingly.


Your while loop would work, but wouldn't be nearly as efficient as xargs.
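A rough way to see that batching in action is sketched below; seq may not be installed on a stock AIX box, so treat this purely as an illustration of how xargs groups arguments.
Code:
# each line of output corresponds to one invocation xargs would have made
seq 1 90000 | xargs sh -c 'echo "one call with $# arguments"' sh
# the argument-size limit xargs stays under:
getconf ARG_MAX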
# 3  
Old 04-20-2011
Code:
xargs -d '\n' rm -f < hugelistofiles

Thank you. I am doing this on an AIX box. I have seen the above command used on Linux. Would it also work on AIX?
# 4  
Old 04-20-2011
Strictly speaking, -d is a GNU extension rather than a POSIX option, so consult 'man xargs' for your system to see whether it is supported. And of course it's just good sense to do a dry run of any command that would delete things:
Code:
# count the lines in the list of files
wc -l hugelistofiles
# -n 1 makes xargs run echo exactly once per argument, which should be exactly
# once per line if -d '\n' stops it from splitting on spaces
xargs -d '\n' -n 1 echo rm -f < hugelistofiles | wc -l

If the count of lines matches, then it's not splitting on spaces.
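If it turns out your xargs has no -d at all, one portable workaround (just a sketch, not something from this thread) is to backslash-escape every character so that the default blank-and-quote parsing leaves each line intact:
Code:
# escape every character on every line; plain xargs then sees one argument per line
sed 's/./\\&/g' hugelistofiles | xargs echo rm -f
# drop the echo once the output looks right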
# 5  
Old 04-20-2011
Quote:
Originally Posted by inakajin
Could this be executed from the shell directly or would it need to be wrapped in a script?
Code:
while read file; do rm "$file"; done < listoffiles.txt

Thanks in advance for any suggestions.
Just try it out on the command line. It takes less than 5 seconds to run, compared with posting and waiting for an answer. Put an "echo" in front of the rm command to see whether your files are interpreted properly:

Code:
$> while read -r file; do echo rm -f "$file"; done < listoffiles.txt

This is the most straightforward way of doing it in the shell, without any external tools.
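A slightly more defensive variant of the same loop is sketched below; the IFS= and the -- are additions for illustration, not part of the post above. IFS= keeps read from trimming leading or trailing spaces, and -- stops rm from treating a name that starts with a dash as an option.
Code:
while IFS= read -r file; do echo rm -f -- "$file"; done < listoffiles.txt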
 