Looking more closely at the standard, when using -exec command +, the primary always returns success even if one or more executions of the given command fail. Furthermore, if there are two or more primaries like this, the order in which they are executed is unspecified. (So, the rm commands for a group of files could be executed before, after, or simultaneously with the ls commands for the same or different groups of files.) I don't understand why this would keep the rm commands from working, but it could cause files to be removed before they were listed by ls.
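The exit-status consequence can be sketched as follows (a sandboxed illustration, assuming GNU or another POSIX-conforming find; the scratch directory and the deliberately failing `sh -c 'exit 1'` are stand-ins, not the poster's commands):

```shell
# Scratch directory with one file to drive -exec.
tmp=$(mktemp -d)
touch "$tmp/a"

# With ';' the command runs once per file; a non-zero exit only makes
# the -exec primary evaluate false, and find itself still exits 0.
find "$tmp" -type f -exec sh -c 'exit 1' sh {} ';'
echo "exit status with ';': $?"

# With '+' the pathnames are batched; if any invocation fails,
# find's own exit status becomes non-zero.
find "$tmp" -type f -exec sh -c 'exit 1' sh {} +
echo "exit status with '+': $?"
```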
When the echo was in the -exec echo rm ..., did you see any output from the echo?
Does the following semicolon version work?
If not, what is the output from the command:
Hi,
I am trying to run a command that finds all files over x amount of days old; the issue is that one of the directories has spaces in its name.
find /files/target directory/*/* -type f -mtime +60
When running the above, the usual error message is thrown back:
+ find '/files/target\' 'directory/*/*' -type... (1 Reply)
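The trace shows the shell splitting the path at the space; quoting the directory part of the path fixes it. A sandboxed sketch of the same layout (GNU `touch -d` is assumed here to backdate the file):

```shell
# Stand-in for /files/target directory, built in a scratch area.
base=$(mktemp -d)
mkdir -p "$base/target directory/sub"
touch -d '61 days ago' "$base/target directory/sub/old.dat"

# The quotes keep "target directory" together as one argument;
# the unquoted /*/* still lets the shell expand the glob.
find "$base/target directory"/*/* -type f -mtime +60
```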
One of our requirements was to connect to a remote Linux server through an SFTP connection and delete files which are older than 7 days.
I used the below piece of code for that,
SFTP_CONNECTION=`sftp user_id@host ...
cd DESIRED_DIR;
find /path/to/files* -mtime +5 -exec rm -rf {} \;
bye... (2 Replies)
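Note that sftp itself cannot run find or rm -rf; it only speaks the file-transfer protocol. One common alternative, assuming the account also has shell access, is to run the deletion over ssh instead. The find part is runnable locally (sandboxed sketch; GNU `touch -d` backdates the test file):

```shell
# Local, sandboxed version of the deletion itself.
dir=$(mktemp -d)
touch "$dir/keep.txt"
touch -d '6 days ago' "$dir/old.txt"
find "$dir" -type f -mtime +5 -exec rm -f {} +

# Remote equivalent, assuming shell access (host/user as in the post):
# ssh user_id@host 'find /path/to/files -type f -mtime +5 -exec rm -f {} +'
```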
Hi All
I want to remove the files with name like data*.csv from the directory older than 10 days.
If no files older than 10 days exist, it should not do anything.
Thanks
Jo (9 Replies)
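A minimal sketch (the directory here is a scratch stand-in for the real one; GNU `touch -d` backdates the old file). If nothing matches, find simply produces no pathnames and rm is never invoked, which gives the "do nothing" behavior for free:

```shell
dir=$(mktemp -d)
touch -d '11 days ago' "$dir/data1.csv"   # should be removed
touch "$dir/data2.csv" "$dir/notes.txt"   # should survive

# Only data*.csv files older than 10 days are handed to rm.
find "$dir" -name 'data*.csv' -type f -mtime +10 -exec rm -f {} +
```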
Hi All,
I am using the code below to delete files older than 2 days. If there are no files, I should log an error saying there are no files to delete.
Please let me know how I can achieve this.
find /path/*.xml -mtime +2
Thanks and Regards
Nagaraja. (3 Replies)
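One way to get the "log an error when there is nothing to delete" behavior is to count matches first and only then delete. A sketch under the assumption that running find twice is acceptable (the scratch directory stands in for the real path, and the error message wording is illustrative):

```shell
dir=$(mktemp -d)   # stand-in for the real directory

count=$(find "$dir" -name '*.xml' -type f -mtime +2 | wc -l)
if [ "$count" -eq 0 ]; then
    echo "no files to delete" >&2
else
    find "$dir" -name '*.xml' -type f -mtime +2 -exec rm -f {} +
fi
```

Counting via a separate find keeps the deletion itself on the safe -exec path, so filenames with spaces are not a problem.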
Hi all,
I want to delete log files with the extension .log which are older than 30 days. How do I delete those files?
Operating system: Sun Solaris 10
Your input is highly appreciated.
Thanks in advance.
Regards,
Williams (2 Replies)
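A sketch using only syntax that Solaris 10 /usr/bin/find accepts (no GNU -delete). The scratch directory stands in for the real log path; note that `touch -d`, used here only to backdate the test file, is GNU-specific and not part of the Solaris example itself:

```shell
dir=$(mktemp -d)                    # stand-in for the log directory
touch -d '31 days ago' "$dir/old.log"
touch "$dir/new.log"

# Portable: -name/-type/-mtime/-exec all exist in Solaris 10 find.
find "$dir" -name '*.log' -type f -mtime +30 -exec rm -f {} \;
```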
I have to delete files that are 15 days old or older, except the ones in the directory Current and also *.sh files.
I have found the command for files 15 days old or older:
find . -type f -mtime +15 -exec ls -ltr {} \;
but how do I implement the logic to avoid the directory Current and also... (3 Replies)
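The usual tools for this are -prune to skip the Current directory and `! -name` to exclude the *.sh files. A sandboxed sketch (GNU `touch -d` backdates the test files; note -name Current prunes any directory with that name at any depth):

```shell
dir=$(mktemp -d)
mkdir "$dir/Current"
touch -d '16 days ago' "$dir/old.txt" "$dir/old.sh" "$dir/Current/inside.txt"

# Prune Current; elsewhere, list files 15+ days old that are not *.sh.
find "$dir" -name Current -prune -o \
    -type f -mtime +15 ! -name '*.sh' -exec ls -l {} \;
```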
I would like to write a script that deletes all files older than 7 days in a directory and its subdirectories. Can anyone help me out with the magic command or script?
Thanks in advance,
Odogboly98 (3 Replies)
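Since find recurses by default, a single invocation covers the directory and its subdirectories. A sandboxed sketch (scratch paths; GNU `touch -d` backdates the test files):

```shell
dir=$(mktemp -d)
mkdir "$dir/sub"
touch -d '8 days ago' "$dir/old.txt" "$dir/sub/old2.txt"
touch "$dir/new.txt"

# Removes 8-day-old files at both levels; new.txt survives.
find "$dir" -type f -mtime +7 -exec rm -f {} +
```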