Using mtime is safer, while using ctime may not work: if a backup program resets atime after reading files, that reset itself causes ctime to advance. On the other hand, if your backup program does reset atime in this manner, then a recent atime indicates that someone other than the backup program has been reading the file, and perhaps it should not be removed. So, assuming a reasonable backup policy, atime may be the best choice.
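For reference, ls can display each of the three timestamps, which makes it easy to check which test a given file would pass (a quick illustration):

ls -l datafile     # mtime (last modification)
ls -lu datafile    # atime (last access)
ls -lc datafile    # ctime (last inode change)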
Now consider a directory structure like:
./a/b/c/datafile
If datafile was recently created inside an already-existing ./a/b/c, then ./a and ./a/b remain unchanged while ./a/b/c has just been updated. Doing an "rm -rf ./a" just because ./a itself has not changed recently would forcibly remove ./a/b/c/datafile which, in this case, is a recent file. Using rmdir for directories solves that, but we must be prepared for ./a to be left alone even though it passes the -ctime test. A tiny demonstration follows (the exact error text varies by system).
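mkdir -p a/b/c
touch a/b/c/datafile
rmdir a/b/c    # refuses: directory not empty, so the recent datafile survives
rm -rf a       # removes everything, recent datafile included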
On the other hand, if ./a/b/c/datafile is all old stuff, removing datafile makes ./a/b/c recently changed. Building a complete list of removal candidates before removing anything solves that.
We need to process ./a/b before we process ./a, which implies that we need -depth on the find statement.
We cannot divide the world into ordinary files and directories unless we really want the script to fail when it encounters a socket, FIFO, special file, and so on. Instead, we need to think in terms of directories and non-directories.
So maybe something like this will get you closer (I have not tested it). The starting point ./a, the +30 day cutoff, and the use of atime are placeholders; filenames are assumed to contain no newlines:
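#!/bin/sh
# Untested sketch: remove old non-directories first, then clean up
# directories with rmdir, deepest entries first (-depth).

list=/tmp/oldfiles.$$

# Pass 1: collect every old non-directory BEFORE deleting anything,
# so removals cannot make a parent directory look recently changed.
find ./a -depth ! -type d -atime +30 -print > "$list"

# Remove the candidates.
while IFS= read -r f
do
    rm -f "$f"
done < "$list"

# Pass 2: rmdir refuses to remove a non-empty directory, so any
# directory still holding recent files (like ./a above) is left alone.
find ./a -depth -type d -exec rmdir {} \; 2>/dev/null

rm -f "$list"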