07-21-2008
Thank you Franklin.
The ARCH_DEST is obtained from within an Oracle database, but as it happens the database was down when this occurred. Could it be that, with ARCH_DEST empty, it ran the command in whatever directory the script happened to be in?
Also, if no .gz files are present, the return code is non-zero. Basically, I am trying to check whether the .gz deletion succeeded and, if it fails, report it as an error.
Something like this:
Delete_Archive() {
    LogDest="select value from v\$parameter where name='standby_archive_dest';"
    runSql "${LogDest}"
    ARCH_DEST="${COUNT}"
    cd "${ARCH_DEST}" || exit 1   # guard against an empty ARCH_DEST
    pwd
    #find ${ARCH_DEST}/*.gz -mtime +0 -exec rm -f {} \;
    rm *.gz
    RC2=$?
    if [[ $RC2 != 0 ]]
    then
        Mail_Subject="Dataguard on host `hostname`. Delete archive logs older than 24 hrs failed. Please investigate (SEV-4)"
        ls -ltr "${ARCH_DEST}" > "${ORACLE_BASE}/oralogs/DELETE_ARCHIVE.log"
        Mail_User "$Mail_Subject" "${ORACLE_BASE}/oralogs/DELETE_ARCHIVE.log"
        exit 1
    fi
}
Can this be coded better? Can I put in a condition saying that if the number of files deleted is 0, it's not an error?
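One way is to let find do both the selection and the deletion, and count matches first so that "nothing to delete" is success. A minimal sketch, assuming a POSIX sh and find; the function name delete_archive and the echo messages are illustrative, and the real script would build Mail_Subject and call Mail_User in the failure branch as before:

```shell
#!/bin/sh
# Delete .gz archive logs older than 1 day; zero matches is not an error.
delete_archive() {
    arch_dest="$1"

    # Count candidate files first so "nothing to delete" is reported as success.
    count=$(find "$arch_dest" -name '*.gz' -mtime +0 -type f | wc -l)

    if [ "$count" -eq 0 ]; then
        echo "No archive logs older than 24 hrs in $arch_dest - nothing to do."
        return 0
    fi

    # find returns non-zero only if rm itself fails (e.g. permissions).
    if find "$arch_dest" -name '*.gz' -mtime +0 -type f -exec rm -f {} \; ; then
        echo "Deleted $count archive log(s) from $arch_dest."
        return 0
    else
        echo "Deletion failed in $arch_dest - investigate." >&2
        return 1
    fi
}
```

Because find is given the directory explicitly, this also avoids the unguarded cd: if ARCH_DEST comes back empty from the database, find fails immediately instead of rm running in the wrong directory.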
Thanks
9 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
Can anyone tell me how I would write a script in ksh on AIX that will delete files in a directory older than 7 days? (1 Reply)
Discussion started by: lesstjm
2. UNIX for Dummies Questions & Answers
I would like to write a script that deletes all files that are older than 7 days in a directory and its subdirectories. Can anyone help me out with the magic command or script?
Thanks in advance,
Odogbolu98 (3 Replies)
Discussion started by: odogbolu98
3. Shell Programming and Scripting
find /basedirectory -type f -mtime +3 >> /tmp/tempfile
find /basedirectory -type d -mtime +3 >> /tmp/tempfile
mailx -s "List of removed files and folders" myemail@domain.com < /tmp/tempfile
rm /tmp/tempfile
find /basedirectory -type f -mtime +3 -exec rm {} \;
find /basedirectory -type d... (7 Replies)
Discussion started by: melanie_pfefer
4. UNIX for Dummies Questions & Answers
This is driving me crazy. How can I delete files in a specific directory that are over 30 days old? Thanks in advance. (3 Replies)
Discussion started by: tlphillips
5. Shell Programming and Scripting
I have to delete files which are 15 days old or more, except the ones in the directory Current and also *.sh files.
I have found the command for files 15 days old or more:
find . -type f -mtime +15 -exec ls -ltr {} \;
but how to implement the logic to avoid directory Current and also... (3 Replies)
Discussion started by: ali560045
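For the exclusion logic asked about in discussion 5, find's -prune can skip the Current directory and a ! -name test can skip the shell scripts. A sketch, assuming a POSIX find; the file names are illustrative:

```shell
#!/bin/sh
# List files older than 15 days, skipping the Current directory and *.sh files.
# -prune stops find from descending into Current; the -o branch handles the rest.
find . -type d -name Current -prune -o \
    -type f ! -name '*.sh' -mtime +15 -print
```

Once the listing looks right, -print can be replaced with -exec rm -f {} \; to do the actual deletion.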
6. Solaris
Hi all,
I want to delete log files with extension .log which are older than 30
days. How to delete those files?
Operating system -- Sun solaris 10
Your input is highly appreciated.
Thanks in advance.
Regards,
Williams (2 Replies)
Discussion started by: William1482
7. Shell Programming and Scripting
Hi All,
I am using the below code to delete files older than 2 days. In case there are no files, I should log an error saying there are no files to delete.
Please let me know how I can achieve this.
find /path/*.xml -mtime +2
Thanks and Regards
Nagaraja. (3 Replies)
Discussion started by: Nagaraja Akkiva
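Discussion 7 wants the opposite of the original poster: an empty match should be an error. One way is to capture find's output and test it for emptiness before deleting. A sketch, assuming a POSIX sh; the function name purge_old_xml is illustrative:

```shell
#!/bin/sh
# Delete *.xml older than 2 days; report an error if there is nothing to delete.
purge_old_xml() {
    dir="$1"
    old_files=$(find "$dir" -name '*.xml' -mtime +2 -type f)
    if [ -z "$old_files" ]; then
        echo "No files to delete in $dir" >&2
        return 1
    fi
    echo "$old_files" | xargs rm -f
}
```

Note that xargs splits on whitespace, so this sketch assumes the file names contain no spaces; with GNU tools, find -print0 piped to xargs -0 handles arbitrary names.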
8. Shell Programming and Scripting
Hi All
I want to remove files with names like data*.csv from the directory that are older than 10 days.
If no files older than 10 days exist to remove, it should not do anything.
Thanks
Jo (9 Replies)
Discussion started by: rajeshjohney
9. UNIX for Advanced & Expert Users
One of our requirements was to connect to a remote Linux server through an SFTP connection and delete some files which are older than 7 days.
I used the below piece of code for that,
SFTP_CONNECTION=`sftp user_id@host ...
cd DESIRED_DIR;
find /path/to/files* -mtime +5 -exec rm -rf {} \;
bye... (2 Replies)
Discussion started by: ATWC
LEARN ABOUT OSF1
db_archive
db_archive(8) System Manager's Manual db_archive(8)
NAME
db_archive - displays security database log files no longer involved in active transactions (Enhanced Security)
SYNOPSIS
/usr/tcb/bin/db_archive [-alsv] [-h home]
FLAGS
-a        Write all pathnames as absolute pathnames, instead of relative to the database home directories.

-h home   Specify a home directory for the database. The correct directory for enhanced security is /var/tcb/files.

-l        Write out the pathnames of all of the database log files, whether or not they are involved in active transactions.

-s        Write the pathnames of all of the database files that need to be archived in order to recover the database from catastrophic failure. If any of the database files have not been accessed during the lifetime of the current log files, db_archive does not include them in this output. It is possible that some of the files referenced in the log have since been deleted from the system. In this case, db_archive ignores them. When db_recover is run, any files referenced in the log that are not present during recovery are assumed to have been deleted and are not recovered.

-v        Run in verbose mode, listing the checkpoints in the log files as they are reviewed.
DESCRIPTION
A customized version of the Berkeley Database (Berkeley DB) is embedded in the operating system to provide high-performance database support for critical security files. The DB includes full transactional support and database recovery, using write-ahead logging and checkpointing to record changes.
The db_archive utility is provided for maintenance of the log files associated with the security database. It writes the pathnames of log
files that are no longer in use (that is, no longer involved in active transactions), to the standard output, one pathname per line. These
log files should be written to backup media to provide for recovery in the case of catastrophic failure (which also requires a snapshot of
the database files), but they may then be deleted from the system to reclaim disk space. You should perform a db_checkpoint -1 before
using db_archive.
The secconfig utility can create a cron job that periodically checks the security log files and deletes those no longer in use, as determined by db_archive. Be sure to coordinate this with the site backup schedule.
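Such a cron job might look like the following sketch; the schedule and exact pipeline are illustrative, not the actual secconfig output:

```
# Illustrative crontab entry: checkpoint once, then remove the log files
# that db_archive reports as no longer in use (nightly at 02:00).
0 2 * * * /usr/tcb/bin/db_checkpoint -1 -h /var/tcb/files && /usr/tcb/bin/db_archive -a -h /var/tcb/files | xargs rm -f
```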
The db_archive utility attaches to one or more of the Berkeley DB shared memory regions. In order to avoid region corruption, it should always be given the chance to detach and exit gracefully. To cause db_archive to clean up after itself and exit, send it an interrupt signal (SIGINT).
RETURN VALUES
The db_archive utility exits 0 on success, and >0 if an error occurs.
ENVIRONMENT VARIABLES
If the -h option is not specified and the environment variable DB_HOME is set, it is used as the path of the database home. The home
directory for security is /var/tcb/files.
FILES
/var/tcb/files/auth.db
/var/tcb/files/dblogs/*
RELATED INFORMATION
Commands: db_checkpoint(8), db_dump(8), db_load(8), db_printlog(8), db_recover(8), db_stat(8), secconfig(8)