Removing a large number of temp files


 
Hi.

I'm with Robin on this: use xargs. You can control how many items are passed to each command invocation, how many characters, and so on; a small sketch of those options follows below. I also like jgt's idea of rm -rf <directory>, provided care is taken that no valuable files are present in the directory.
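
For instance, a minimal sketch of those controls (assuming GNU xargs; the limits shown are arbitrary, and data2 is the list file used in the demonstration below):
Code:
# Pass at most 500 names to each rm invocation.
xargs -n 500 rm < data2

# Or cap each generated command line at roughly 16000 characters.
xargs -s 16000 rm < data2

# jgt's approach, when the entire directory holds nothing of value
# (the path here is purely illustrative):
rm -rf /path/to/tempdir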

One problem is filenames containing shell meta-characters, the most common being a space, as in t 10. Here are two methods for dealing with that situation:
Code:
#!/usr/bin/env bash

# @(#) s3       Demonstrate removing files in groups with xargs.

# Utility functions: print-as-echo, print-line-with-visual-space, debug.
# export PATH="/usr/local/bin:/usr/bin:/bin"
LC_ALL=C ; LANG=C ; export LC_ALL LANG
pe() { for _i;do printf "%s" "$_i";done; printf "\n"; }
pl() { pe;pe "-----" ;pe "$*"; }
em() { pe "$*" >&2 ; }
db() { ( printf " db, ";for _i;do printf "%s" "$_i";done;printf "\n" ) >&2 ; }
db() { : ; }
# Show tool versions via a local "context" utility, if it is present.
C=$HOME/bin/context && [ -f "$C" ] && $C /bin/ls xargs tr

# Data file to hold the list of temporary file names.

FILE=${1-data2}

# Create a number of files.

pl " Temporary files:"
rm -f t*
touch t{1..9}
/bin/ls t*

# Put temporary files into a list.
/bin/ls -1 t* > $FILE

pl " Input data file $FILE, columnized:"
column $FILE

pl " Results, xargs, default command is echo:"
xargs < $FILE

pl " Files are still present:"
/bin/ls t*
echo " Exit status from ls: $?"

pl " Results, change command to rm, files are now gone, expect message:"
xargs rm < $FILE
/bin/ls t*
echo " Exit status from ls: $?"

# Now with a file with a special character -- a space -- in it.
pl " Temporary files:"
rm -f t*
touch t{1..9} "t 10"
/bin/ls t*

pl " Results, not all files are gone:"
xargs rm < $FILE
/bin/ls t*
echo " Exit status from ls: $?"

# Again with a file with a special character -- a space -- in it.
pl " Temporary files:"
rm -f t*
touch t{1..9} "t 10"
/bin/ls t*

pl " List files with quotes around them, (displayed columnized):"
/bin/ls -Q t* |
tee $FILE |
column

pl " Results, files are now gone, expect message:"
xargs rm < $FILE
/bin/ls t*
echo " Exit status from ls: $?"

# Third time, with a file with a special character -- a space -- in it.
pl " Temporary files:"
rm -f t*
touch t{1..9} "t 10"
/bin/ls t*

pl " Add a null character to the end of each name:"
/bin/ls -1 t* |
tr '\n' '\0' |
tee $FILE 

pl " Results, files are now gone, expect message:"
xargs --null rm < $FILE
/bin/ls t*
echo " Exit status from ls: $?"

exit 0

producing:
Code:
$ ./s3

Environment: LC_ALL = C, LANG = C
(Versions displayed with local utility "version")
OS, ker|rel, machine: Linux, 3.16.0-4-amd64, x86_64
Distribution        : Debian 8.6 (jessie) 
bash GNU bash 4.3.30
/bin/ls ls (GNU coreutils) 8.23
xargs (GNU findutils) 4.4.2
tr (GNU coreutils) 8.23

-----
 Temporary files:
t1  t2  t3  t4  t5  t6  t7  t8  t9

-----
 Input data file data2, columnized:
t1      t2      t3      t4      t5      t6      t7      t8      t9

-----
 Results, xargs, default command is echo:
t1 t2 t3 t4 t5 t6 t7 t8 t9

-----
 Files are still present:
t1  t2  t3  t4  t5  t6  t7  t8  t9
 Exit status from ls: 0

-----
 Results, change command to rm, files are now gone, expect message:
/bin/ls: cannot access t*: No such file or directory
 Exit status from ls: 2

-----
 Temporary files:
t 10  t1  t2  t3  t4  t5  t6  t7  t8  t9

-----
 Results, not all files are gone:
t 10
 Exit status from ls: 0

-----
 Temporary files:
t 10  t1  t2  t3  t4  t5  t6  t7  t8  t9

-----
 List files with quotes around them (displayed columnized):
"t 10"  "t1"    "t2"    "t3"    "t4"    "t5"    "t6"    "t7"    "t8"    "t9"

-----
 Results, files are now gone, expect message:
/bin/ls: cannot access t*: No such file or directory
 Exit status from ls: 2

-----
 Temporary files:
t 10  t1  t2  t3  t4  t5  t6  t7  t8  t9

-----
 Add a null character to the end of each name:
t 10t1t2t3t4t5t6t7t8t9
-----
 Results, files are now gone, expect message:
/bin/ls: cannot access t*: No such file or directory
 Exit status from ls: 2

See man pages for details.
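
As a variation on the null-terminated approach above, GNU find can produce the null-separated list directly with -print0, which also copes with newlines in names. A minimal sketch, assuming the temporary files match t* in the current directory:
Code:
# Build the list and remove in one pipeline; -print0 and -0 keep
# names containing spaces (or even newlines) intact.
find . -maxdepth 1 -type f -name 't*' -print0 | xargs -0 rm -f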

Best wishes ... cheers, drl