Too many files to list / remove | Unix Linux Forums | UNIX for Dummies Questions & Answers


    #1  
Old 08-01-2002
dinplant is offline
Registered User
 
Join Date: Jun 2001
Last Activity: 25 March 2006, 9:49 AM EST
Posts: 6
Thanks: 0
Thanked 0 Times in 0 Posts
Too many files to list / remove

I have a directory which has 614,000 files.

When attempting to do an ls -ralt in the directory, a "too many arguments" error is shown.

1. I would like to see the files and their stats in the directory
2. I would like to delete certain files of a certain age
    #2  
Old 08-01-2002
peter.herlihy is offline
Registered User
 
Join Date: Nov 2001
Last Activity: 1 August 2006, 11:51 AM EDT
Location: New Zealand
Posts: 333
Thanks: 0
Thanked 0 Times in 0 Posts
You should be able to manage this with a simple for loop.

for each_file in /path/directory/*
do
ls -latr "$each_file" >> directory_log
done

Now you can't have your cake and eat it too... so if you want to see this list first (before you remove the old files), you can view this file.

If you then want to remove old files...

find /path/directory/ -mtime +29 -exec ls {} \;

Where +29 matches anything modified more than 29 days ago in the specified directory.

You could combine these into one command if you want... but I'll leave you something to do!

Oh, postscript... mtime is the date modified!
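A minimal sketch of combining the two steps. The temporary directory stands in for /path/directory, and the year-2000 timestamp is just a way to fabricate a file that is safely older than 29 days:

```shell
# Stand-in for /path/directory; the filenames here are made up for illustration.
dir=$(mktemp -d)
touch "$dir/recent.txt"
touch -t 200001010000 "$dir/old.txt"   # mtime set to 2000-01-01, well past 29 days
# Log everything older than 29 days, then remove it in the same pass:
find "$dir" -type f -mtime +29 -exec ls -ld {} \; > "$dir/directory_log"
find "$dir" -type f -mtime +29 -exec rm {} \;
```

After this runs, recent.txt survives and old.txt is gone, with a record of what was deleted left in directory_log.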
    #3  
Old 08-01-2002
RTM is offline - Forum Advisor
Registered User
 
Join Date: Apr 2002
Last Activity: 3 April 2014, 2:50 PM EDT
Location: On my motorcycle
Posts: 3,092
Thanks: 1
Thanked 30 Times in 9 Posts
Peter's quote:
Quote:
If you then want to remove old files...
find /path/directory/ -mtime +29 -exec ls {} \;
I believe he meant -exec rm {} \; . When testing this, I always put ls in place of rm.

Another thought (I can't test it - none of my servers have that many files in one directory) is that Peter's -exec ls would give you a listing with no problem (it would not be ls -latr), but you would be able to see all the files.
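A quick sketch of that test-first habit: run the find with ls, inspect the output, and only swap in rm once it lists exactly what you expect. The path and timestamp below are made up for illustration:

```shell
# Stand-in directory with one deliberately stale file:
dir=$(mktemp -d)
touch -t 200001010000 "$dir/stale.dat"
find "$dir" -type f -mtime +29 -exec ls -ld {} \;   # dry run: lists, deletes nothing
# find "$dir" -type f -mtime +29 -exec rm {} \;     # the real command, once verified
```

Because only ls runs, the stale file is still there afterwards; the commented-out rm line is the destructive step you enable deliberately.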
    #4  
Old 08-02-2002
Perderabo is offline - Forum Staff
Unix Daemon (Administrator Emeritus)
 
Join Date: Aug 2001
Last Activity: 28 August 2014, 9:35 PM EDT
Location: Ashburn, Virginia
Posts: 9,921
Thanks: 60
Thanked 431 Times in 257 Posts
Handling a directory this large is going to require very careful attention to performance considerations. I usually hold my tongue when I see someone suggest the -exec option on a "find" command. But in this case, it will be a very large problem. A command like:

find /path/directory/ -mtime +29 -exec ls {} \;

is going to launch one "ls" process for each file. In this case, that is way too many. We need to get as many files on the "ls" (or "rm") command line as possible. That way, a single process will be handling dozens or maybe hundreds of files at once. We can do this with:

cd /path/directory
find . -mtime +29 -print | xargs ls -d

(I always use -d in a case like this, in case the "find" output includes a subdirectory.) By cd'ing to the directory first and then using "." in the "find" command, we shorten the pathnames that find will output. This means that xargs can collect more of them for each "ls" process it invokes.

Using xargs is always better than -exec, but with a small number of files, it's not a big deal.
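The batching difference can be sketched in a few lines. The temporary directory and filenames are illustrative stand-ins; the point is that xargs packs many names into one rm invocation, where -exec ... \; would fork a process per file:

```shell
# Stand-in directory with a handful of files:
dir=$(mktemp -d)
for i in 1 2 3 4 5; do touch "$dir/file$i"; done
cd "$dir"
# Relative "." paths are short, so xargs fits more of them per invocation.
# One rm handles all five files here, instead of five separate rm processes:
find . -type f -print | xargs rm
```

With 614,000 files the same shape holds, only the savings become hundreds of thousands of avoided fork/exec cycles.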

Peter may have meant "ls"; the OP did request help obtaining such a listing. But can anyone read a listing that is 600,000 lines long? There is really no point to such a listing.

Any shell script written to process these files will also need careful attention to performance.
This:

for each_file in /path/directory/*

is not going to work. The shell will try to expand that asterisk and it will fail. Something like this:

#! /usr/bin/ksh
cd /path/directory
find . -print | while read each_file ; do

will work, but whatever the loop does must be carefully coded. It must use only shell built-in commands and maybe some pre-launched co-processes. Invoking even 4 or 5 processes per loop iteration would mean millions of total processes. Such a script would take a very long time to run.
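A sketch of that loop shape under those constraints: feed find into a while-read loop and keep the body to built-ins like case and echo, which fork no extra processes. The temporary directory, the *.log pattern, and the output file are illustrative stand-ins:

```shell
dir=$(mktemp -d)    # stand-in for /path/directory
out=$(mktemp)       # log kept outside $dir so find does not pick it up
touch "$dir/a.log" "$dir/b.log" "$dir/c.txt"
cd "$dir"
find . -type f -print | while read each_file ; do
    case $each_file in                         # case is a built-in: no fork
        *.log) echo "old log: $each_file" ;;
        *)     echo "other:   $each_file" ;;
    esac
done > "$out"
```

Every file is classified and logged, yet the loop body itself launches zero external processes, which is what keeps a 614,000-iteration run tractable.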