The most important thing to take into account is to avoid using a for-loop: you would run into an "argument list too long" error. I have no directory with that many files at hand to test it, but the following should work (I still can't tell you about execution time, so test it carefully):
if [ "$amount_files_input" -ge 100000 ] ; then
    for i in {1..10000} ; do
        file_to_move=`ls -1 $input_dir | tail -1`
        mv $input_dir/$file_to_move $output_dir
    done
fi
What if a folder has thousands of files, let's say from the past 2 years...
How can I search for the files from 2009 and just move them to a separate folder?
The date format is: 2009-03-25....
Can 'xargs' be used here? If yes, how?
Since this is presumably a one-time job, I would use something like:
ls -ltr >filelist
Then edit filelist and remove all the file names that you do not want to move.
Then write a script to read the edited file and move the files.
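Such a script is not shown in the post; one possible shape for it is below. The file and directory names are made up for the demo, and the list is assumed to hold one bare filename per line after editing (the ls -ltr columns removed). The quoting and IFS=/read -r here guard against whitespace pitfalls that come up later in the thread.

```shell
# Demo setup (hypothetical names): a couple of files and a list of them,
# standing in for the hand-edited filelist described above.
mkdir -p src dest
: > "src/report 1.txt"
: > "src/report 2.txt"
printf '%s\n' "src/report 1.txt" "src/report 2.txt" > filelist

# Read one name per line and move each file. IFS= preserves leading
# blanks and read -r preserves backslashes; a name containing an actual
# newline still cannot be carried through a line-oriented list.
while IFS= read -r file; do
    mv -- "$file" dest/
done < filelist
```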
---------- Post updated at 11:20 AM ---------- Previous update was at 11:01 AM ----------
Quote:
Originally Posted by urandom
I like the above solutions much better than mine, but I want to show my solution anyway, just to show that there are many possible solutions.
if [ "$amount_files_input" -ge 100000 ] ; then
    for i in {1..10000} ; do
        file_to_move=`ls -1 $input_dir | tail -1`
        mv $input_dir/$file_to_move $output_dir
    done
fi
and of course I'm a newbie still
Yes, it will work with fewer than 250 entries in the directory, but....
With thousands of entries in the directory, ls might well take significant time to execute.
I tried the "ls -1 $input_dir |tail -1" line on a directory with 168000 files.
Real time was 0.50 seconds*. You execute this command 10,000 times, which comes to roughly 5,000 seconds (over 80 minutes) spent in ls alone.
*on a dual processor quad core system with serial SCSI RAID10
What if a folder has thousands of files, let's say from the past 2 years...
How can I search for the files from 2009 and just move them to a separate folder?
The date format is: 2009-03-25....
Can 'xargs' be used here? If yes, how?
It's probably best to start your own thread for this question. Aside from having to deal with a lot of files, it really has nothing to do with the original poster's problem and will only muddle the discussion, in my opinion.
Regards,
Alister
---------- Post updated at 04:48 PM ---------- Previous update was at 11:25 AM ----------
Hello, bakunin:
Quote:
Originally Posted by bakunin
The most important thing to take into account is to avoid using a for-loop: you would run into an "argument list too long" error.
That's actually bad advice in general (there is absolutely no need to avoid a for loop just because the list may be very long), and it's terrible advice in this particular case, where a glob in a for loop's list is by far the simplest and safest way to handle a directory of files. Field splitting is not a concern, since a glob is expanded during the penultimate step of shell command-line processing; only quote removal follows it.
Your warning regarding "argument list too long" scenarios does not apply to a shell expanding a wildcard, which is done internally and does not require an exec system call. Nor does it apply to the for loop, since that is also internal. Nor does it apply to any commands within the for loop, since they are fed the list items one at a time. For more info regarding ARG_MAX issues, the thread "The maximum length of arguments for a new process" may be helpful.
To test for yourself, you can execute the following (if your system has jot; if not, perhaps you can tweak it to use seq, or even brace expansion):
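The code block from the original post did not survive in this copy of the thread. A reconstruction along the lines described below (the file count, name length, and do-nothing loop come from the description; awk stands in for jot/seq for portability, and the scratch directory name is made up) might be:

```shell
# Create 100,000 empty files, each with a 100-character (zero-padded)
# filename, inside a scratch directory.
mkdir -p argmax_test
awk 'BEGIN { for (i = 1; i <= 100000; i++) printf "%0100d\n", i }' |
while IFS= read -r name; do
    : > "argmax_test/$name"
done

# A do-nothing loop that nevertheless forces the shell to expand the
# glob over all 100,000 names -- no "argument list too long" occurs,
# because no exec system call is involved.
for f in argmax_test/*; do
    :
done
echo "glob expanded over $(ls argmax_test | wc -l | tr -d ' ') names"
# prints: glob expanded over 100000 names
```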
That will create 100,000 files, each with a 100-character filename, and then run a do-nothing loop which nevertheless has to expand the * wildcard.
Regarding your solution, there are some caveats: it will not properly handle any files which contain leading whitespace, embedded newlines, or a trailing backslash. The whitespace and trailing backslash can be fixed by tweaking IFS and using read's -r option. The embedded newline, however, cannot be worked around (at least not with any POSIX-compliant functionality in ls/read that I'm aware of).
On an unrelated note, the following idiom will always return a 0 exit status, since when the exit command executes, the value of $? is the exit status of the [ command, which must have succeeded if the exit has been reached.
Quote:
Originally Posted by bakunin
A simple, correct way would be:
exit will only execute if mv fails, and it will return mv's exit status (the last command run).
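Neither code block survived in this copy of the thread, but the two idioms under discussion can be sketched as follows (the mv operands are placeholders; only the exit-status behavior matters here):

```shell
# The broken idiom: by the time "exit $?" runs, $? no longer holds mv's
# status -- it holds the status of the [ command, which must have
# succeeded (returned 0) for the exit to be reached at all.
sh -c '
    mv /no/such/file /tmp 2>/dev/null
    if [ $? -ne 0 ]; then
        exit $?
    fi
'
echo "broken idiom exited with: $?"     # prints 0 despite the failed mv

# The simple, correct way: a bare "exit" (here via ||) propagates the
# status of the last command run, which is mv itself.
sh -c 'mv /no/such/file /tmp 2>/dev/null || exit'
echo "correct idiom exited with: $?"    # nonzero: mv's own exit status
```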
The solution you provided also assumes that all the target directories exist, since there is no mkdir anywhere (that may be intentional; I'm just pointing it out to save the original poster some time).
Please don't take the above criticisms personally; they are intended to be helpful. If my analysis is erroneous, I would appreciate being corrected.
Regards,
Alister
---------- Post updated at 04:57 PM ---------- Previous update was at 04:48 PM ----------
Hello, jgt:
Quote:
Originally Posted by jgt
This solution is broken with regard to whitespace: $file in the mv should be double-quoted. There are also problems with the way read is used, which will mangle filenames with leading whitespace, embedded newlines, or a trailing backslash.
I mention it only in case the original poster's monster directory has some susceptible filenames.
Regards,
Alister
---------- Post updated at 05:23 PM ---------- Previous update was at 04:57 PM ----------
My attempt at a solution. It should handle any filenames without issue. The only downside is that it must execute mv once per file to move. However, I'll take that performance hit over possible breakage, since mv'ing 100K files one at a time only takes about 5 minutes on a 3-year-old laptop with a slow drive. It operates on the current working directory and also creates the necessary destination directories (001, 002, ...) in the current working directory.
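The code block itself did not survive in this copy of the thread; a hedged sketch of the approach just described is below. The batch size, scratch directory, and sample filenames are illustrative assumptions, not the original script; the shape (a glob-driven loop, numbered directories created as needed, one mv per file) follows the description above.

```shell
# Illustrative setup: a scratch directory with a few files whose names
# contain spaces (the real job involved ~100K files).
mkdir -p scratch
(
    cd scratch || exit
    for i in 1 2 3 4 5; do
        : > "file $i.txt"
    done

    batch_size=2        # assumed; pick whatever per-directory limit you need
    n=0
    dir=
    for f in ./*; do
        [ -f "$f" ] || continue       # skip anything that isn't a regular file
        if [ $((n % batch_size)) -eq 0 ]; then
            dir=$(printf '%03d' $((n / batch_size + 1)))
            mkdir -p "$dir"           # create 001, 002, ... as needed
        fi
        mv -- "$f" "$dir/"            # one mv per file; odd names stay intact
        n=$((n + 1))
    done
)
ls scratch    # -> 001 002 003
```

Because the glob is expanded once, before the loop body runs, the newly created numbered directories never appear in the list being moved.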
Creating 100,000 files with 100 character long filenames:
A test run, followed by some simple checks:
Regards,
Alister