help on most efficient search


 
# 1
Old 04-05-2008

Hello,

We have a directory with 15 sub-directories, and each sub-directory contains 1.5 to 2 lakhs (150,000 to 200,000) of files. Daily, around 300-500 files are uploaded to each sub-directory.

Now, I need to get the list of files received today in the most efficient way. I tried using "find" with the -newer option and also "ls -ltr" with tail, but both take a long time to produce the list of images received today.

Please advise me on the most efficient way (it should take the least time possible) to find today's files.
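For reference, what I tried looks roughly like this (the directory names are only placeholders):

    # option 1: compare against a marker file, then touch it for the next run
    find /data/parent -type f -newer /data/parent/.last_run -print
    touch /data/parent/.last_run

    # option 2: per sub-directory, sort by mtime and take the newest entries
    ls -ltr /data/parent/subdir01 | tail -500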


TIA
Prvn
# 2  
Old 04-05-2008
Those are the tools I would use. Perhaps you could instead rearrange the directory structure so that all new files are in a "recent" directory tree, and then move them to the "main" directory tree when you no longer want to treat them as "recent"?
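For example, a nightly cron job along these lines (the /data/recent and /data/main paths are just placeholders) could demote anything older than a day from the recent tree:

    # Bourne shell sketch; assumes file names without embedded whitespace
    cd /data/recent || exit 1
    find . -type f -mtime +0 -print | while read f
    do
        dest=/data/main/`dirname "$f"`
        mkdir -p "$dest" && mv "$f" "$dest/"
    done

Listing today's files then becomes a cheap find (or ls) over the small recent tree.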

Which file system are you using? NTFS in particular is a real dog when it comes to coping with large directories. You might want to investigate whether it might make sense to switch to Reiser or XFS or something.

(I just happen to have a vague idea of what a lakh is, but you'd probably better avoid regional lingo like that in an international forum.)

Last edited by era; 04-05-2008 at 03:41 PM.. Reason: Wikipedia link for "lakh"
# 3  
Old 04-05-2008
If these are web uploads, perhaps it would be simpler to process the web server's log file?
# 4  
Old 04-05-2008
If you have 10000 entries in a directory, then in order to find files by ctime or mtime you have to stat() all 10000 of them. If these are NFS-mounted directories, it takes even longer, regardless of the remote filesystem type.

Either use the log as era suggests, or alter the app that sends the files so that it writes a list of the files to a text file in a central location.
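For example, a small wrapper around the copy step (the paths and log location are only illustrative) could append every delivered file to a per-day manifest:

    #!/bin/sh
    # hypothetical wrapper: copy each argument into the target directory and
    # record the destination path in today's manifest
    TARGET=/data/subdir01
    MANIFEST=/var/log/uploads/`date +%Y%m%d`.list
    for f in "$@"
    do
        base=`basename "$f"`
        cp "$f" "$TARGET/" && echo "$TARGET/$base" >> "$MANIFEST"
    done

Getting today's list is then just a cat of today's manifest, with no directory scan at all.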
# 5  
Old 04-05-2008
Hi Era,

Thanks for your reply.

using "recent" directory may not suit as we must pick the files list from actual location only.

We are running Solaris 9 with UFS file system.

They are not web or FTP uploads but just copied (using "cp" command)

Thanks
Prvn
# 6  
Old 04-05-2008
Maybe you could institute a policy to use "cp -v" and direct the output to a file?
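Something along these lines, assuming the cp being used actually supports -v (GNU cp does; I am not sure the stock Solaris 9 cp does), and with made-up paths:

    # capture the verbose copy output as a running log of what arrived today
    cp -v /staging/*.img /data/subdir01/ >> /var/log/copied_`date +%Y%m%d`.log 2>&1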

Another solution which I guess might not be suitable for you would be to rearchitect the whole thing to use a database instead of the bare file system.

Dunno about Solaris but on Linux you can install a daemon which monitors the file system for you, and can keep track of which files have been created recently. Maybe you could find something like this for your system.
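On Linux the usual mechanism is inotify; a minimal sketch with inotifywait from the inotify-tools package (the paths are examples, and it will not help directly on Solaris 9, which has no inotify):

    # append the full path of every file created anywhere under /data to a log
    inotifywait -m -r -e create --format '%w%f' /data >> /var/log/new_files.log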
# 7  
Old 04-05-2008
Thanks Era.

Could you please let me know the names of the daemons on Linux for monitoring directories? I will try to get the source and compile it on Solaris.

I know of AIDE, but I think it would take even more time to check, as I have millions of files in the directories to monitor.

Prvn