Fastest way to traverse through large directories
Post 93752 by sreedharange on Wednesday 21st of December 2005 10:27:29 PM
Hi Guys,

Thanks for your input...

From what I hear here, I guess the approach I've taken is about the best available. The C program is pretty fast, but with hundreds of thousands of files I'm looking for greased lightning. About 20,000 files get added every day, and I'm supposed to archive the old files (those older than 60 days). Sixty days' worth means about 1.2 million files, and that rate keeps growing as time goes by.
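
For reference, here is a minimal sketch of the kind of traversal I mean. It is not my actual program; it just walks a tree with POSIX nftw(), applies an assumed 60-day cutoff, and prints the matching paths, which is the general shape of the approach:

/*
 * Sketch only: walk a directory tree with POSIX nftw() and print
 * regular files whose modification time is older than CUTOFF_DAYS.
 * The printed list is what would later be handed to the archiver.
 */
#define _XOPEN_SOURCE 500
#include <ftw.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define CUTOFF_DAYS 60          /* assumed cutoff for illustration */

static time_t cutoff;

static int visit(const char *path, const struct stat *sb,
                 int typeflag, struct FTW *ftwbuf)
{
    (void)ftwbuf;
    if (typeflag == FTW_F && sb->st_mtime < cutoff)
        puts(path);             /* candidate for archiving */
    return 0;                   /* keep walking */
}

int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <directory>\n", argv[0]);
        return 1;
    }
    cutoff = time(NULL) - (time_t)CUTOFF_DAYS * 24 * 60 * 60;

    /* FTW_PHYS: do not follow symlinks; 64 descriptors for deep trees */
    if (nftw(argv[1], visit, 64, FTW_PHYS) == -1) {
        perror("nftw");
        return 1;
    }
    return 0;
}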

I guess one way to beat the sheer number of files is to start the traversal deeper down the directory tree. But then I'd have to run the application multiple times, once for each folder path that I select.
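
If I do go that way, one rough sketch of splitting the work would be to fork one scan per top-level subdirectory. The "./oldfiles" name below is just a placeholder for the real scanner program, the sketch does not check that each entry really is a directory, and a production version would cap the number of concurrent children:

/*
 * Sketch only: fork one child per entry in the parent directory and
 * let each child run the existing scanner on its own subtree.
 */
#define _XOPEN_SOURCE 500
#include <dirent.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/wait.h>
#include <unistd.h>

int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <parent-directory>\n", argv[0]);
        return 1;
    }

    DIR *dp = opendir(argv[1]);
    if (dp == NULL) {
        perror("opendir");
        return 1;
    }

    struct dirent *de;
    while ((de = readdir(dp)) != NULL) {
        if (strcmp(de->d_name, ".") == 0 || strcmp(de->d_name, "..") == 0)
            continue;

        char sub[4096];
        snprintf(sub, sizeof sub, "%s/%s", argv[1], de->d_name);

        pid_t pid = fork();
        if (pid == 0) {
            /* child: hand one subtree to the scanner ("./oldfiles" is a
             * placeholder for the actual program) */
            execl("./oldfiles", "oldfiles", sub, (char *)NULL);
            perror("execl");
            _exit(127);
        }
    }
    closedir(dp);

    /* wait for every child so the combined file list is complete */
    while (wait(NULL) > 0)
        ;
    return 0;
}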

Just to complete the picture, the list of files older than 60 days is then fed to NetBackup (the archiving application from Symantec), which moves the files to tape. I use NetBackup because it makes it easy to restore particular files whenever I need them.
 
