9 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
One of our requirements was to connect to a remote Linux server through an SFTP connection and delete files older than 7 days.
I used the piece of code below:
SFTP_CONNECTION=`sftp user_id@host ...
cd DESIRED_DIR;
find /path/to/files* -mtime +5 -exec rm -rf {} \;... (1 Reply)
Discussion started by: ATWC
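Note that sftp has no find or -exec, so the quoted approach cannot work inside an sftp session; the usual fix is to run the cleanup on the remote side over ssh. A minimal sketch (user_id, host and the path are placeholders from the question; the backdating relies on GNU touch):

```shell
#!/bin/sh
# sftp cannot run 'find', so execute the cleanup remotely over ssh instead
# (user_id, host and /path/to/dir are placeholders):
#
#   ssh user_id@host "find /path/to/dir -type f -mtime +7 -exec rm -f {} +"
#
# Local demonstration of the same find invocation:
dir=$(mktemp -d)
touch "$dir/keep.txt"
touch -d '8 days ago' "$dir/stale.txt"   # GNU touch: backdate the mtime
find "$dir" -type f -mtime +7 -exec rm -f {} +
ls "$dir"                                # only keep.txt remains
```

`-mtime +7` matches files last modified more than 7×24 hours ago, so the 8-day-old file qualifies while the fresh one survives.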
2. Shell Programming and Scripting
Hi All,
I am using the code below to delete files older than 2 days. If there are no files, I should log an error saying there are no files to delete.
Please let me know how I can achieve this.
find /path/*.xml -mtime +2
Thanks and Regards
Nagaraja. (3 Replies)
Discussion started by: Nagaraja Akkiva
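One common approach is to capture find's output first and branch on whether anything matched. A sketch, using a throwaway directory in place of the question's /path (note the xargs step breaks on filenames containing spaces):

```shell
#!/bin/sh
# Capture the matches first, then either delete them or log an error.
# DIR stands in for the question's /path.
DIR=$(mktemp -d)                       # demo directory; use your real path
matches=$(find "$DIR" -name '*.xml' -mtime +2)
if [ -z "$matches" ]; then
    echo "ERROR: no files to delete" >&2
else
    printf '%s\n' "$matches" | xargs rm -f
fi
```

Since the demo directory starts empty, this takes the error branch; with matching files present, the list is fed to rm instead.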
3. Solaris
Hi all,
I want to delete log files with extension .log which are older than 30
days. How to delete those files?
Operating system: Sun Solaris 10
Your input is highly appreciated.
Thanks in advance.
Regards,
Williams (2 Replies)
Discussion started by: William1482
4. Shell Programming and Scripting
I have to delete files which are 15 days old or more, except the ones in the directory Current and also *.sh files.
I have found the command for files 15 days or older:
find . -type f -mtime +15 -exec ls -ltr {} \;
but how to implement the logic to avoid directory Current and also... (3 Replies)
Discussion started by: ali560045
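Both exclusions can be expressed in a single find: -prune stops the descent into Current, and `! -name '*.sh'` skips shell scripts. A sketch on a throwaway tree (directory names taken from the question):

```shell
#!/bin/sh
# -prune keeps find out of Current; ! -name '*.sh' excludes scripts.
root=$(mktemp -d)
mkdir "$root/Current"
touch -d '20 days ago' "$root/old.dat" "$root/old.sh" "$root/Current/old.dat"
cd "$root"
find . -name Current -prune -o -type f ! -name '*.sh' -mtime +15 -print
# prints only ./old.dat; swap -print for "-exec rm -f {} \;" to delete
```

The file inside Current and the 20-day-old *.sh file are both left alone; only the plain data file is reported.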
5. Shell Programming and Scripting
Guys,
I had raised a question about deleting files older than today in a specific directory and I got this as an answer:
find ${ARCH_DEST}/*.gz -mtime +0 -exec rm -f {} \;
What happens when there are no files that meet this criterion? Can it delete any other directories? I had a shocking... (22 Replies)
Discussion started by: kamathg
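The danger in the quoted command is that the shell expands the unquoted glob `${ARCH_DEST}/*.gz` before find ever runs; if ARCH_DEST is empty or unset, the path degenerates towards the filesystem root. A more defensive sketch (variable name from the question; -maxdepth is a GNU find extension):

```shell
#!/bin/sh
# Fail fast if the variable is unset/empty, and let find do the matching
# instead of a shell glob.
ARCH_DEST=$(mktemp -d)                  # demo value; normally set elsewhere
: "${ARCH_DEST:?ARCH_DEST is not set}"  # abort instead of deleting from /
touch -d '2 days ago' "$ARCH_DEST/a.gz"
touch "$ARCH_DEST/fresh.gz"
find "$ARCH_DEST" -maxdepth 1 -type f -name '*.gz' -mtime +0 -exec rm -f {} \;
```

`-mtime +0` matches files modified more than 24 hours ago, so a.gz is removed and fresh.gz is kept; -type f guarantees no directory can ever be deleted.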
6. UNIX for Dummies Questions & Answers
This is driving me crazy. How can I delete files in a specific directory that are over 30 days old? Thanks in advance. (3 Replies)
Discussion started by: tlphillips
7. Shell Programming and Scripting
find /basedirectory -type f -mtime +3 >> /tmp/tempfile
find /basedirectory -type d -mtime +3 >> /tmp/tempfile
mailx -s "List of removed files and folders" myemail@domain.com < /tmp/tempfile
rm /tmp/tempfile
find /basedirectory -type f -mtime +3 -exec rm {} \;
find /basedirectory -type d... (7 Replies)
Discussion started by: melanie_pfefer
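A weakness of running find twice, as above, is that the mailed list and the deleted set can drift apart if files age in between. One way to tighten it is to build the list once and delete exactly that list. A sketch, with a throwaway tree standing in for /basedirectory (-mindepth is a GNU find extension, and the xargs step breaks on paths with spaces):

```shell
#!/bin/sh
# Build the deletion list once, mail it, then delete that exact list.
# BASE stands in for /basedirectory; mailx is shown but commented out
# so the demo stays self-contained.
BASE=$(mktemp -d)
mkdir "$BASE/olddir"
touch -d '5 days ago' "$BASE/oldfile" "$BASE/olddir"
tmp=$(mktemp)
find "$BASE" -mindepth 1 -mtime +3 -print > "$tmp"
if [ -s "$tmp" ]; then
    # mailx -s "List of removed files and folders" myemail@domain.com < "$tmp"
    xargs rm -rf < "$tmp"              # caveat: unsafe for paths with spaces
fi
rm -f "$tmp"
```

This covers both the `-type f` and `-type d` passes of the original in one traversal, and the mail body is guaranteed to match what was removed.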
8. UNIX for Dummies Questions & Answers
I would like to write a script that deletes all files older than 7 days in a directory and its subdirectories. Can anyone help me out with the magic command or script?
Thanks in advance,
Odogbolu98 (3 Replies)
Discussion started by: odogbolu98
9. UNIX for Dummies Questions & Answers
Can anyone tell me how I would write a script in ksh on AIX that will delete files in a directory older than 7 days? (1 Reply)
Discussion started by: lesstjm
STOREBACKUPDEL(1) User Contributed Perl Documentation STOREBACKUPDEL(1)
NAME
storeBackupDel.pl - this program deletes backups created by storeBackup
SYNOPSIS
storeBackupDel.pl [-f configFile] [--print]
[-b backupDirectory] [-S series] [--doNotDelete]
[--deleteNotFinishedDirs] [-L lockFile]
[--keepAll timePeriod] [--keepWeekday entry] [--keepFirstOfYear]
[--keepLastOfYear] [--keepFirstOfMonth] [--keepLastOfMonth]
[--keepFirstOfWeek] [--keepLastOfWeek]
[--keepDuplicate] [--keepMinNumber] [--keepMaxNumber]
[-l logFile]
[--plusLogStdout] [--suppressTime] [-m maxFilelen]
[[-n noOfOldFiles] | [--saveLogs [--compressWith compressprog]]]
WARNING
!!! USAGE IN PARALLEL WITH storeBackup.pl CAN DESTROY YOUR BACKUPS !!!
OPTIONS
--file, -f
configuration file (instead of parameters)
--print
print configuration read from configuration file and stop
--backupDir, -b
top level directory of all backups (must exist)
--series, -S
directory of backup series
same parameter as in storeBackup / relative path
from backupDir, default is 'default'
--lockFile, -L
lock file; if it exists, a new instance will exit if
an old one is already running, default is $lockFile
--doNotDelete
test only, do not delete any backup
--deleteNotFinishedDirs
delete old backups which were not finished;
this will not happen if --doNotDelete is set
--keepAll
keep backups which are not older than the specified amount
of time. This is like a default value for all days in
--keepWeekday. Deletion begins at the end of the script run;
the time range has to be specified in the format 'dhms', e.g.
10d4h means 10 days and 4 hours
default = $keepAll;
--keepWeekday
keep backups for the specified days for the specified
amount of time. Overwrites the default values chosen in
--keepAll. 'Mon,Wed:40d Sat:60d10m' means:
keep backups of Mon and Wed for 40 days
keep backups of Sat for 60 days + 10 mins
keep backups of the rest of the days as specified in
--keepAll (default $keepAll)
if you also use the 'archive flag' it means to not
delete the affected directories via --keepMaxNumber:
a10d4h means 10 days and 4 hours and 'archive flag'
e.g. 'Mon,Wed:a40d Sat:60d10m' means:
keep backups of Mon and Wed for 40 days + 'archive'
keep backups of Sat for 60 days + 10 mins
keep backups of the rest of the days as specified in
--keepAll (default $keepAll)
--keepFirstOfYear
do not delete the first backup of a year
format is timePeriod with possible 'archive flag'
--keepLastOfYear
do not delete the last backup of a year
format is timePeriod with possible 'archive flag'
--keepFirstOfMonth
do not delete the first backup of a month
format is timePeriod with possible 'archive flag'
--keepLastOfMonth
do not delete the last backup of a month
format is timePeriod with possible 'archive flag'
--firstDayOfWeek
default: 'Sun'. This value is used for calculating
--keepFirstOfWeek and --keepLastOfWeek
--keepFirstOfWeek
do not delete the first backup of a week
format is timePeriod with possible 'archive flag'
--keepLastOfWeek
do not delete the last backup of a week
format is timePeriod with possible 'archive flag'
--keepDuplicate
keep multiple backups of one day up to timePeriod
format is timePeriod, 'archive flag' is not possible
default = $keepDuplicate;
--keepMinNumber
Keep at least that minimum number of backups. Multiple
backups of one day are counted as one backup. Default is 10.
--keepMaxNumber
Try to keep at most that number of backups. If there are
more backups, the following deletion sequence
happens:
- delete all duplicates of a day, beginning with the
oldest ones, except the oldest of every day
- if this is not enough, delete the rest of the backups
beginning with the oldest, but *never* a backup with
the 'archive flag' or the last backup
--keepRelative, -R
Alternative deletion scheme. If you use this option, all other
keep options are ignored. Preserves backups depending
on their *relative* age. Example:
-R '1d 7d 2m 3m'
will (try to) ensure that there is always
- One backup between 1 day and 7 days old
- One backup between 7 days and 2 months old
- One backup between 2 months and 3 months old
If there is no backup for a specified timespan
(e.g. because the last backup was done more than 2 weeks
ago) the next older backup will be used for this timespan.
--logFile, -l
log file (default is STDOUT)
--plusLogStdout
if you specify a log file with --logFile you can
additionally print the output to STDOUT with this flag
--suppressTime
suppress output of time in logfile
--maxFilelen, -m
maximal length of the log file, default = 1e6
--noOfOldFiles, -n
number of old log files, default = 5
--saveLogs
save log files with date and time instead of deleting the
old ones (with [--noOfOldFiles])
--compressWith
compress saved log files (e.g. with 'gzip -9')
default is 'bzip2'
COPYRIGHT
Copyright (c) 2003-2008 by Heinz-Josef Claes (see README). Published under the GNU General Public License v3 or any later version
perl v5.14.2 2012-06-16 STOREBACKUPDEL(1)