Full Discussion: deleting older directories
Post 302158614 by frank_rizzo, Wednesday 16 January 2008, 12:47:57 AM
Your system probably does not support it. What OS are you using? Use the first method instead; it should work for you unless you run into weird filenames.
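For reference, a minimal sketch of the kind of portable find/-exec command being recommended here (the exact "first method" from earlier in the thread is not quoted in this post); the parent directory and the 30-day cutoff are placeholders:

    # Delete directories one level under /path/to/parent that are older than 30 days.
    # ! -name . skips the starting directory itself, -prune keeps find from descending
    # any further, and -exec ... {} \; handles odd filenames on any POSIX find.
    cd /path/to/parent &&
    find . ! -name . -type d -prune -mtime +30 -exec rm -rf {} \;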
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Deleting files older than a given date

Hi all, I want to delete all files in a directory which are older than a given date. I thought of doing it by creating a file with the required date using the touch command; then I would run find against that file to locate the files older than it. I searched the man and found a... (3 Replies)
Discussion started by: rajugp1
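The reference-file approach described in that question does work. A hedged sketch, where the directory, the cut-off date and the temporary file name are placeholders:

    # Create a marker file carrying the cut-off date (2008-01-01 00:00 here),
    # then remove every file that is not newer than it.
    touch -t 200801010000 /tmp/cutoff.$$
    find /path/to/dir -type f ! -newer /tmp/cutoff.$$ -exec rm -f {} \;
    rm -f /tmp/cutoff.$$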

2. Solaris

Deleting Files Older than 24 hours

Hi, I am using a Solaris box. I need to delete files (cookies.html) from the path (/usr/temp) which are older than 24 hours (I want it in hours, not in days). Can you provide the command for the above query? (7 Replies)
Discussion started by: mazhar803
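Stock Solaris find has no -mmin, but for a cutoff of exactly 24 hours the day-granular test is enough. A hedged sketch using the path and file name from the question:

    # -mtime +0 matches files whose age, truncated to whole days, is greater
    # than zero, i.e. files modified more than 24 hours ago.
    find /usr/temp -type f -name cookies.html -mtime +0 -exec rm -f {} \;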

3. Shell Programming and Scripting

Script for deleting files older than 30 days

Hi, I need a script to delete files that are older than 30 days and whose file names contain aa, ab or ac as a substring... Regards, Dolly.... (3 Replies)
Discussion started by: moon_friend
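A hedged sketch for that request; /path/to/dir is a placeholder and the age is judged by modification time:

    # Delete files older than 30 days whose names contain aa, ab or ac.
    find /path/to/dir -type f -mtime +30 \
        \( -name '*aa*' -o -name '*ab*' -o -name '*ac*' \) \
        -exec rm -f {} \;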

4. Shell Programming and Scripting

Deleting files older than 7 days

Hi Guys, I want to delete folders/files older than 7 days. I'm using the command below. find /test/test1 -mtime +7 -print0 | xargs -0 rm -Rf /test/test1/* which works OK, but it deletes the test1 folder as well, which I don't want. The test1 folder will have a list of sub-folders which in... (4 Replies)
Discussion started by: shezam
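The extra /test/test1/* argument in that command makes rm wipe everything regardless of age, and find also matches the top directory itself. A hedged fix (-mindepth is a GNU find option, which the -print0/xargs -0 usage suggests is available):

    # Delete entries older than 7 days below /test/test1 without touching
    # /test/test1 itself; nothing beyond find's matches is passed to rm.
    find /test/test1 -mindepth 1 -mtime +7 -print0 | xargs -0 rm -rf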

5. Shell Programming and Scripting

Script for parsing directories one level and finding directories older than n days

Hello all, here's the deal... I have one directory with many subdirs and files. What I want to find out is who is keeping old files and directories, i.e. files and dirs that have not been used for n days, looking only one level under the initial dir, with the output written to a file. A script for... (5 Replies)
Discussion started by: ejianu
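A hedged sketch of such a report. N, the directory and the output file are placeholders, -mindepth/-maxdepth are GNU find options, and you could swap -mtime for -atime if access times are reliable on that filesystem:

    # List top-level files and directories untouched for more than N days,
    # with owner and timestamp, and write the report to a file.
    N=30
    find /path/to/dir -mindepth 1 -maxdepth 1 -mtime +"$N" \
        -exec ls -ld {} \; > old_entries.txt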

6. UNIX for Advanced & Expert Users

Deleting older files of a particular type

Hi, this should be easy but I'm obviously missing something. :) I'm looking to delete .txt files from yesterday and older, and there is a range of subfolders containing these files. The command runs but doesn't delete anything. SUSE 10. find /testfolder -maxdepth 2 -type f... (6 Replies)
Discussion started by: cmap
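The command in that post is cut off, but the usual culprits are an unquoted name pattern (which the shell expands before find sees it) or a missing action. A hedged sketch along the lines of what was being attempted:

    # Delete .txt files modified more than 24 hours ago, at most two levels deep.
    find /testfolder -maxdepth 2 -type f -name '*.txt' -mtime +0 -exec rm -f {} \;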

7. Shell Programming and Scripting

deleting files older than 7 days

Hi guys, I am new to UNIX. I am looking for a script to delete files older than 7 days, but I also want to exclude certain directories (like arch, log, ...) and also files with certain extensions (like .ksh, .ch, ...). Thanks (1 Reply)
Discussion started by: MAYAMAYA0451
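A hedged sketch for that request; the top directory is a placeholder and the excluded names are taken from the question:

    # Prune the directories that must be skipped, then delete files older
    # than 7 days that do not carry the excluded extensions.
    find /path/to/dir \( -name arch -o -name log \) -prune -o \
        -type f -mtime +7 ! -name '*.ksh' ! -name '*.ch' \
        -exec rm -f {} \;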

8. Shell Programming and Scripting

Deleting files older than 6 hours

Hi All, I am using the script below to find all the files in a folder which are older than 6 hours and delete them, but somehow I am not getting the required output. find $HOME/Log -type f -name "*.log" -amin +360 -exec rm *.* {} \ Can anyone please check and let me know... (13 Replies)
Discussion started by: subhasri_2020
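Three things break that command: the stray *.* argument handed to rm, the missing \; that should terminate -exec, and -amin testing access time rather than modification time. A hedged corrected form (-mmin, like the -amin already in use, is a GNU find option):

    # Delete .log files modified more than 6 hours (360 minutes) ago.
    find "$HOME/Log" -type f -name '*.log' -mmin +360 -exec rm -f {} \;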

9. Shell Programming and Scripting

Deleting Files Older than 1 Hour

How can I delete files older than 1 hour on SunOS? These files are generated every minute. -rw-r--r-- 1 nobody nobody 4960 Jan 27 02:02 23_201301270201.log -rw-r--r-- 1 nobody amudu 2325 Jan 27 02:03 33_201301270202.log -rw-r--r-- 1 nobody amudu 3255 Jan 27 02:03... (2 Replies)
Discussion started by: ooilinlove
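Stock SunOS find has no -mmin, so one workaround is a reference file whose timestamp is set one hour in the past. A hedged sketch; the perl one-liner and the log directory are assumptions, not taken from the thread:

    # Build "one hour ago" in touch -t's CCYYMMDDhhmm format, then delete
    # every .log file that is not newer than the reference file.
    ref=/tmp/one_hour_ago.$$
    touch -t `perl -e '@t=localtime(time-3600); printf "%04d%02d%02d%02d%02d", $t[5]+1900,$t[4]+1,$t[3],$t[2],$t[1]'` "$ref"
    find /path/to/logs -type f -name '*.log' ! -newer "$ref" -exec rm -f {} \;
    rm -f "$ref"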

10. Shell Programming and Scripting

Need help deleting files one week old

Hi, I need to delete *.bad files which are 1 week old. How can I achieve that? I tried it with the script below, but it deletes all the files. find ./ -mtime +7 -exec rm *.bad {} \; The one below works, but I want to delete only files with the .bad extension: find . -mtime +7 | xargs rm (2 Replies)
Discussion started by: Gangadhar Reddy
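Combining the two attempts quoted above: keep the age test, but let find itself restrict the match to *.bad so only those files ever reach rm. A hedged sketch:

    # Delete only *.bad files modified more than 7 days ago.
    find . -type f -name '*.bad' -mtime +7 -exec rm -f {} \;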

STOREBACKUPDEL(1)            User Contributed Perl Documentation            STOREBACKUPDEL(1)

NAME
    storeBackupDel.pl - this program deletes backups created by storeBackup

SYNOPSIS
    storeBackupDel.pl [-f configFile] [--print] [-b backupDirectory] [-S series]
        [--doNotDelete] [--deleteNotFinishedDirs] [-L lockFile] [--keepAll timePeriod]
        [--keepWeekday entry] [--keepFirstOfYear] [--keepLastOfYear] [--keepFirstOfMonth]
        [--keepLastOfMonth] [--keepFirstOfWeek] [--keepLastOfWeek] [--keepDuplicate]
        [--keepMinNumber] [--keepMaxNumber]
        [-l logFile [--plusLogStdout] [--suppressTime] [-m maxFilelen]
        [[-n noOfOldFiles] | [--saveLogs] [--compressWith compressprog]]

WARNING
    !!! USAGE IN PARALLEL WITH storeBackup.pl CAN DESTROY YOUR BACKUPS !!!

OPTIONS
    --file, -f
        configuration file (instead of parameters)

    --print
        print configuration read from configuration file and stop

    --backupDir, -b
        top level directory of all backups (must exist)

    --series, -S
        directory of backup series; same parameter as in storeBackup,
        relative path from backupDir, default is 'default'

    --lockFile, -L
        lock file; if it exists, new instances will finish if an old one
        is already running, default is $lockFile

    --doNotDelete
        test only, do not delete any backup

    --deleteNotFinishedDirs
        delete old backups which were not finished;
        this will not happen if --doNotDelete is set

    --keepAll
        keep backups which are not older than the specified amount of time.
        This is like a default value for all days in --keepWeekday.
        Deleting begins at the end of the script. The time range has to be
        specified in format 'dhms', e.g. 10d4h means 10 days and 4 hours;
        default = $keepAll

    --keepWeekday
        keep backups for the specified days for the specified amount of
        time; overwrites the default values chosen in --keepAll.
        'Mon,Wed:40d5m Sat:60d10m' means:
            keep backups of Mon and Wed 40 days + 5 mins
            keep backups of Sat 60 days + 10 mins
            keep backups of the rest of the days as specified in --keepAll
            (default $keepAll)
        If you also use the 'archive flag', the affected directories are
        not deleted via --keepMaxNumber: a10d4h means 10 days and 4 hours
        and 'archive flag', e.g. 'Mon,Wed:a40d5m Sat:60d10m' means:
            keep backups of Mon and Wed 40 days + 5 mins + 'archive'
            keep backups of Sat 60 days + 10 mins
            keep backups of the rest of the days as specified in --keepAll
            (default $keepAll)

    --keepFirstOfYear
        do not delete the first backup of a year;
        format is timePeriod with possible 'archive flag'

    --keepLastOfYear
        do not delete the last backup of a year;
        format is timePeriod with possible 'archive flag'

    --keepFirstOfMonth
        do not delete the first backup of a month;
        format is timePeriod with possible 'archive flag'

    --keepLastOfMonth
        do not delete the last backup of a month;
        format is timePeriod with possible 'archive flag'

    --firstDayOfWeek
        default: 'Sun'; this value is used for calculating
        --keepFirstOfWeek and --keepLastOfWeek

    --keepFirstOfWeek
        do not delete the first backup of a week;
        format is timePeriod with possible 'archive flag'

    --keepLastOfWeek
        do not delete the last backup of a week;
        format is timePeriod with possible 'archive flag'

    --keepDuplicate
        keep multiple backups of one day up to timePeriod;
        format is timePeriod, 'archive flag' is not possible;
        default = $keepDuplicate

    --keepMinNumber
        keep that minimum of backups; multiple backups of one day are
        counted as one backup; default is 10

    --keepMaxNumber
        try to keep only that maximum of backups. If you have more backups,
        the following sequence of deleting will happen:
        - delete all duplicates of a day, beginning with the old ones,
          except the oldest of every day
        - if this is not enough, delete the rest of the backups beginning
          with the oldest, but *never* a backup with the 'archive flag'
          or the last backup

    --keepRelative, -R
        alternative deletion scheme; if you use this option, all other keep
        options are ignored. Preserves backups depending on their *relative*
        age. Example: -R '1d 7d 2m 3m' will (try to) ensure that there is
        always
        - one backup between 1 day and 7 days old
        - one backup between 7 days and 2 months old
        - one backup between 2 months and 3 months old
        If there is no backup for a specified timespan (e.g. because the
        last backup was done more than 2 weeks ago), the next older backup
        will be used for this timespan.

    --logFile, -l
        log file (default is STDOUT)

    --plusLogStdout
        if you specify a log file with --logFile, you can additionally
        print the output to STDOUT with this flag

    --suppressTime
        suppress output of time in logfile

    --maxFilelen, -m
        maximal length of file, default = 1e6

    --noOfOldFiles, -n
        number of old log files, default = 5

    --saveLogs
        save log files with date and time instead of deleting the old ones
        (with [-noOldFiles])

    --compressWith
        compress saved log files (e.g. with 'gzip -9'), default is 'bzip2'

COPYRIGHT
    Copyright (c) 2003-2008 by Heinz-Josef Claes (see README).
    Published under the GNU General Public License v3 or any later version.

perl v5.14.2                           2012-06-16                          STOREBACKUPDEL(1)
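A hedged usage sketch built from the options listed above; the backup directory, series name and keep policy are placeholder values, not taken from the man page:

    # Dry run: report which backups in the 'daily' series under /backups would
    # be removed, keeping everything from the last 30 days plus the first
    # backup of each month for a year. Drop --doNotDelete to actually delete.
    storeBackupDel.pl -b /backups -S daily --keepAll 30d \
        --keepFirstOfMonth 365d --doNotDelete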