Operating Systems > AIX > Delete old files and keep a few files
Post 302234599 by wanasmadi on Wednesday 10th of September 2008, 02:03:01 AM
Quote:
Originally Posted by dennis.jacob
Why don't you try find?

Code:
find dir_name -name "filename_pattern" -mtime +60 -exec rm {} \;


Hi dennis.jacob,

I already tried this command, but it still does not work.
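Another way to attack the goal in the thread title (delete old files but keep a few) is to keep the N newest files regardless of age, instead of filtering by -mtime. Below is a minimal sketch as a shell function; keep_newest is a hypothetical name, and it assumes plain filenames without embedded spaces or newlines:

```shell
#!/bin/sh
# keep_newest DIR N: delete all but the N newest files in DIR.
# Sketch only -- assumes plain filenames (no spaces or newlines).
keep_newest() {
    dir=$1
    keep=$2
    # ls -t lists newest first; tail -n +$((keep + 1)) yields
    # everything after the first N names, i.e. the older files.
    ls -t "$dir" | tail -n +$((keep + 1)) | while read -r f; do
        rm -f "$dir/$f"
    done
}
```

Calling keep_newest /some/dir 5 would remove everything except the five most recently modified files. Unlike -mtime +60, this always keeps something even if every file in the directory is old.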
STOREBACKUPDEL(1)					User Contributed Perl Documentation					 STOREBACKUPDEL(1)

NAME
       storeBackupDel.pl - this program deletes backups created by storeBackup

SYNOPSIS
       storeBackupDel.pl [-f configFile] [--print] [-b backupDirectory]
              [-S series] [--doNotDelete] [--deleteNotFinishedDirs]
              [-L lockFile] [--keepAll timePeriod] [--keepWeekday entry]
              [--keepFirstOfYear] [--keepLastOfYear] [--keepFirstOfMonth]
              [--keepLastOfMonth] [--keepFirstOfWeek] [--keepLastOfWeek]
              [--keepDuplicate] [--keepMinNumber] [--keepMaxNumber]
              [-l logFile [--plusLogStdout]] [--suppressTime]
              [-m maxFilelen] [[-n noOfOldFiles] | [--saveLogs]
              [--compressWith compressprog]]

WARNING
       !!! USAGE IN PARALLEL WITH storeBackup.pl CAN DESTROY YOUR BACKUPS !!!

OPTIONS
       --file, -f
              configuration file (instead of parameters)

       --print
              print configuration read from configuration file and stop

       --backupDir, -b
              top level directory of all backups (must exist)

       --series, -S
              directory of backup series; same parameter as in storeBackup,
              relative path from backupDir, default is 'default'

       --lockFile, -L
              lock file; if it exists, new instances will finish if an old
              one is already running, default is $lockFile

       --doNotDelete
              test only, do not delete any backup

       --deleteNotFinishedDirs
              delete old backups which were not finished;
              this will not happen if doNotDelete is set

       --keepAll
              keep backups which are not older than the specified amount of
              time. This is like a default value for all days in
              --keepWeekday. Begins deleting at the end of the script.
              The time range has to be specified in format 'dhms', e.g.
              10d4h means 10 days and 4 hours; default = $keepAll

       --keepWeekday
              keep backups for the specified days for the specified amount
              of time. Overwrites the default values chosen in --keepAll.
              'Mon,Wed:40d5m Sat:60d10m' means:
                keep backups of Mon and Wed 40 days + 5 mins
                keep backups of Sat 60 days + 10 mins
                keep backups of the rest of the days as specified in
                --keepAll (default $keepAll)
              If you also use the 'archive flag' it means to not delete
              the affected directories via --keepMaxNumber: a10d4h means
              10 days and 4 hours and 'archive flag', e.g.
              'Mon,Wed:a40d5m Sat:60d10m' means:
                keep backups of Mon and Wed 40 days + 5 mins + 'archive'
                keep backups of Sat 60 days + 10 mins
                keep backups of the rest of the days as specified in
                --keepAll (default $keepAll)

       --keepFirstOfYear
              do not delete the first backup of a year;
              format is timePeriod with possible 'archive flag'

       --keepLastOfYear
              do not delete the last backup of a year;
              format is timePeriod with possible 'archive flag'

       --keepFirstOfMonth
              do not delete the first backup of a month;
              format is timePeriod with possible 'archive flag'

       --keepLastOfMonth
              do not delete the last backup of a month;
              format is timePeriod with possible 'archive flag'

       --firstDayOfWeek
              default: 'Sun'. This value is used for calculating
              --keepFirstOfWeek and --keepLastOfWeek

       --keepFirstOfWeek
              do not delete the first backup of a week;
              format is timePeriod with possible 'archive flag'

       --keepLastOfWeek
              do not delete the last backup of a week;
              format is timePeriod with possible 'archive flag'

       --keepDuplicate
              keep multiple backups of one day up to timePeriod;
              format is timePeriod, 'archive flag' is not possible;
              default = $keepDuplicate

       --keepMinNumber
              keep that minimum of backups. Multiple backups of one day
              are counted as one backup. Default is 10.

       --keepMaxNumber
              try to keep only that maximum of backups. If you have more
              backups, the following sequence of deleting will happen:
              - delete all duplicates of a day, beginning with the oldest
                ones, except the oldest of every day
              - if this is not enough, delete the rest of the backups
                beginning with the oldest, but *never* a backup with the
                'archive flag' or the last backup

       --keepRelative, -R
              alternative deletion scheme. If you use this option, all
              other keep options are ignored. Preserves backups depending
              on their *relative* age. Example: -R '1d 7d 2m 3m' will
              (try to) ensure that there is always
              - one backup between 1 day and 7 days old
              - one backup between 7 days and 2 months old
              - one backup between 2 months and 3 months old
              If there is no backup for a specified timespan (e.g. because
              the last backup was done more than 2 weeks ago) the next
              older backup will be used for this timespan.

       --logFile, -l
              log file (default is STDOUT)

       --plusLogStdout
              if you specify a log file with --logFile you can
              additionally print the output to STDOUT with this flag

       --suppressTime
              suppress output of time in logfile

       --maxFilelen, -m
              maximal length of log file, default = 1e6

       --noOfOldFiles, -n
              number of old log files, default = 5

       --saveLogs
              save log files with date and time instead of deleting the
              old ones (with [--noOfOldFiles])

       --compressWith
              compress saved log files (e.g. with 'gzip -9'),
              default is 'bzip2'

COPYRIGHT
       Copyright (c) 2003-2008 by Heinz-Josef Claes (see README).
       Published under the GNU General Public License v3 or any later
       version.

perl v5.14.2                      2012-06-16                STOREBACKUPDEL(1)
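Putting the options above together, a dry-run invocation might look like the following sketch; the backup directory and series name are illustrative assumptions, and --doNotDelete ensures nothing is actually removed:

```shell
# Dry run: report which backups would be deleted while keeping
# 30 days of all backups, 60 days of Saturday backups, and at
# least 10 backups overall. The paths are placeholders.
storeBackupDel.pl -b /var/storeBackup \
    -S default \
    --keepAll 30d \
    --keepWeekday 'Sat:60d' \
    --keepMinNumber 10 \
    --doNotDelete
```

Dropping --doNotDelete makes the same command delete for real, so it is worth checking the dry-run output first.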
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.