deleting 100k log files quickly


# 1  
Old 08-28-2002

I've walked into a while loop gone bad.....which has created 100k+ log files.

Is it quicker removing the files with rm pattern* or actually removing the entire directory with rm -rf dir/?

It's taking ages (hours) either way... just curious if one is going to be quicker than the other... or is it actually using the same logic for deletion?
# 2  
Old 08-28-2002
Don't know which is faster if either - care to run your loop and time them both?

# 3  
Old 08-28-2002
Yeah sure....perhaps I could mow your lawns as well - and wash your car?

It's done now... didn't 'seem' to be quicker either way... am just curious about the logic that rm -rf uses vs rm *. Does it simply expand the file list much as rm * does?
# 4  
Old 08-28-2002
Do I understand that you removed 100,000 files or more? If so "rm *" would not work. A whole bunch of "rm pattern*" could do it. So could "ls | xargs rm". I would go with "rm -rf /directory" just as you did. It's the least work for me.
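To make the options concrete, here's a rough sketch of each approach against a small throwaway directory (the /tmp path and file names are made up for illustration, standing in for the real log directory):

```shell
# Throwaway test directory standing in for the real log directory.
mkdir -p /tmp/logtest
cd /tmp/logtest
for i in 1 2 3 4 5; do touch "app.$i.log"; done

# Option 1: let the shell expand a pattern. With 100k+ files the
# expanded argument list can exceed the exec() limit and fail with
# "Argument list too long", which is why it's commented out here:
# rm app.*.log

# Option 2: pipe the listing to xargs, which batches the names into
# as many rm invocations as needed (safe here, no spaces in names):
ls | xargs rm

# Option 3: remove the whole directory tree in a single process:
cd /
rm -rf /tmp/logtest
```

Either way, every file still costs one unlink(); the approaches only differ in how many processes get created to issue those calls.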

Which is easiest on the system? The difference will be minor; it's the 100,000 unlink() system calls that take forever. But "rm -rf /directory" is one process while the other techniques are many processes. That already gives it an edge. What's more, the other techniques are creating processes with long argument lists. The shell will fork() and exec() to create a process. exec() causes a new program to overlay the program that called exec(). But first, the environment and the arguments must be saved so that they can be passed to the new program.
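The argument-list limit behind that "Argument list too long" failure can be inspected directly; a quick check (getconf and xargs behave this way on any POSIX system):

```shell
# ARG_MAX is the kernel's ceiling on the combined size of the
# arguments and environment passed through exec():
getconf ARG_MAX

# xargs stays under that ceiling by splitting its input across as
# many invocations as needed; with only three names it needs one:
printf 'a\nb\nc\n' | xargs echo rm    # prints: rm a b c
```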

So you probably saved several seconds!

But seriously, I applaud your curiosity. An understanding of the internal operation of Unix can often be very useful. But in this case, any way you cut it, you still had to do 100,000 unlinks.
# 5  
Old 08-28-2002
I did patterns..... but I like the xargs option....

So if I saved myself several seconds - I guess I spent more than that figuring out that I did!

I'm guessing from your explanation then that the effort (ergo time) involved in removing one 1 MB file is considerably less than for 1,000-odd files of 1 KB each.
# 6  
Old 08-29-2002
Quote:
Originally posted by peter.herlihy
I'm guessing from your explanation then that the effort (ergo time) involved in removing one 1 MB file is considerably less than for 1,000-odd files of 1 KB each.
Absolutely! And this would be true even if the 1,000 files were scattered around in several different directories. But Unix searches directories sequentially. When you put thousands of files in one directory, the performance goes into the toilet.
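A rough way to see this (the /tmp path is made up; note also that modern filesystems index large directories, so the linear-scan behaviour described here is mainly true of the filesystems of that era):

```shell
# Populate a single directory with 1000 entries.
mkdir -p /tmp/bigdir
i=0
while [ "$i" -lt 1000 ]; do
    : > "/tmp/bigdir/f$i"
    i=$((i + 1))
done
ls /tmp/bigdir | wc -l    # 1000 entries

# Each unlink() must first locate its entry; with a linear directory
# scan, deleting everything is roughly quadratic in the entry count.
# (Run the deletion under time(1), e.g. `time rm -rf /tmp/bigdir`,
# to see the effect grow as the entry count grows.)
rm -rf /tmp/bigdir
```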
# 7  
Old 08-29-2002
Quote:
But unix searches directories sequentially. When you put thousands of files in one directory the performance goes into the toilet.
You're saying it's easier on Unix to remove a thousand files from a thousand different directories than it is to remove them from one directory?

Or do you mean that it removes files sequentially within the directory... in which case, sequentially or randomly, it's still removing every file in the directory, so why is that slower?