How to delete a huge number of files at a time
# 8  
Old 12-30-2010
xargs could be used to split the workload into pieces, but the trouble with (non-GNU) xargs is that it breaks on file names containing spaces, since the input delimiter cannot be specified.
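With GNU findutils the delimiter problem disappears entirely: find can emit NUL-terminated names and xargs can consume them with -0. A minimal sketch (the temporary demo directory and file names below are stand-ins for the real ones):

```shell
# Demo directory so the sketch is self-contained; in the real case this
# would be the directory holding the millions of fileXXXXXXX.dat files.
demo=$(mktemp -d)
touch "$demo/file0000001.dat" "$demo/file0000002.dat" "$demo/file with space.dat"

# -print0 / -0 pair: NUL is the delimiter, so any legal file name is safe.
# -n 100 caps each rm invocation at 100 arguments (GNU/BSD findutils).
find "$demo" -maxdepth 1 -name '*.dat' -print0 | xargs -0 -n 100 rm -f

ls "$demo" | wc -l    # prints 0: everything, spaces included, is gone
```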
But how about something like this
Code:
ls | grep '^file.......\.dat$' |
( IFS="
"                        # split only on newlines, so spaces in names survive
  set --
  i=0
  while read -r file
  do
    set -- "$@" "$file"
    if [ $((i+=1)) -ge 32 ]; then
      rm -- "$@"
      i=0
      set --
    fi
  done
  [ $# -gt 0 ] && rm -- "$@"    # delete the final partial batch, if any
)

The number 32 could be changed to 16 or some other number.
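For context on why batching is needed at all: the kernel caps the combined size of the argument list and environment passed to a single exec(), which is exactly what makes an expanded "rm file???????.dat" fail for millions of files. The cap can be inspected with the POSIX getconf utility:

```shell
# ARG_MAX is the byte limit on argv + environment for one exec().
limit=$(getconf ARG_MAX)
echo "ARG_MAX on this system: $limit bytes"
# At roughly 16 bytes per name, a batch of 32 uses only ~512 bytes of that
# budget, so the batch size could safely be raised into the thousands.
```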

This should probably work too (using ls as a harmless stand-in for rm):
Code:
( set --
  i=0
  for f in file???????.dat
  do
    set -- "$@" "$f"
    if [ $((i+=1)) -ge 32 ]; then
      ls -- "$@"       # dry run; change ls to rm to actually delete
      i=0
      set --
    fi
  done
  [ $# -gt 0 ] && ls -- "$@"
)


Last edited by Scrutinizer; 12-30-2010 at 02:27 PM..
# 9  
Old 12-30-2010
Quote:
Code:
ls file???????.dat |while read file
do
echo "Deleting file $file"
rm $file
done
The above code is not suitable for 5 million files, for three reasons:
1) The glob still expands on the "ls" command line, which will be too long.
2) The "ls" program sorts the file names into alphabetical order - a massive overhead at this scale.
3) The unquoted "rm $file" does not correctly deal with file names containing space characters.
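All three objections can be sidestepped by letting find generate and batch the names itself: there is no glob expansion onto a command line, find does not sort, and each name is handled internally so spaces are harmless. A sketch (a demo directory stands in for the real one; -maxdepth is a GNU/BSD extension, while the "-exec ... {} +" batching is POSIX):

```shell
demo=$(mktemp -d)
touch "$demo/one.dat" "$demo/two words.dat"

# -exec rm -f {} + batches as many names per rm as the system limit allows.
find "$demo" -maxdepth 1 -name '*.dat' -exec rm -f {} +

ls "$demo" | wc -l    # prints 0
```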


Any chance of an answer to my earlier questions?
# 10  
Old 12-30-2010
@sandholm is quite correct:
Quote:
Originally Posted by sandholm
xargs is your friend.
And so is grep.

If you want to delete the files and monitor how many are left:
Code:
cd directory
ls -f1 | grep '^file.......\.dat$' | xargs -P 4 rm -f &
while sleep 10; do
  ls -f1 | grep -c '^file.......\.dat$'
done

With a large number of files, the -f option keeps ls from sorting; the order in which the files are deleted or counted should not matter. Note that -P 4 (up to four parallel rm processes) is a GNU/BSD xargs extension, not plain POSIX.

This is a quick-n-dirty script, ymmv.

---------- Post updated at 09:30 AM ---------- Previous update was at 09:26 AM ----------

If the directory only contains these files, it would be easier to:
Code:
rm -rf directory
mkdir directory
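One caveat: a plain mkdir recreates the directory with umask defaults rather than its original permissions and ownership. A sketch that records and restores the mode (stat -c is GNU coreutils; the flags differ on BSD, and a temporary directory stands in for the real one):

```shell
dir=$(mktemp -d)/spool
mkdir -m 750 "$dir"           # stand-in for the real directory
mode=$(stat -c '%a' "$dir")   # record its permission bits

rm -rf "$dir"
mkdir -m "$mode" "$dir"       # recreate with the same mode
# Ownership could be restored similarly via stat -c '%U:%G' and chown,
# though chown to another user needs root.

stat -c '%a' "$dir"           # prints 750
```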
