How to compress & delete files in ksh


 
# 1  
Old 01-21-2007
How to compress & delete files in ksh

Hi, I would like to know how to write a script to compress and delete old files from the /var/mqm/log file system.

I am a complete beginner and would love it if someone could actually give me the code for this.

Thank you.
# 2  
Old 01-21-2007
You want to compress & delete the same files? Then why compress them? Why not just delete them? Also, how old is "old"? Can you be more specific?
# 3  
Old 01-21-2007
#!/usr/bin/ksh

LOGDIR=/var/mqm/log
COMPRESSDAYS=3
DELETEDAYS=5

find ${LOGDIR} -type f ! -name \*.gz -mtime +${COMPRESSDAYS} -exec gzip {} \;
find ${LOGDIR} -type f -name \*.gz -mtime +${DELETEDAYS} -exec rm {} \;


This script compresses all files in your log dir which are older than 3 days, and deletes all compressed files which are older than 5 days.

However, many Unix systems have a standard facility to arrange something like this for you, like "logadm" on Solaris systems.
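If you want this to run unattended, you could schedule the script from root's crontab. A sketch of such an entry is below; the path /usr/local/bin/clean_mqm_logs.ksh is just an example name for wherever you save the script:

```shell
# Example crontab entry (edit with "crontab -e"):
# run the compress-and-delete script every night at 02:30.
# /usr/local/bin/clean_mqm_logs.ksh is a hypothetical path.
30 2 * * * /usr/local/bin/clean_mqm_logs.ksh >/dev/null 2>&1
```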
# 4  
Old 01-22-2007
About file system getting full

Please suggest what to do. I have to compress the log file, move it, and then later delete it. I have never worked on this before, so please suggest the best way to do it. I have to run this job from crontab.

Someone wrote this for me; could you please explain how exactly it works? Any help is appreciated.

#!/bin/ksh
DATE=`date +%Y%m%d%H%M`
FILES=`find /soa/p2/syst/var/log -name XMLTTrace.log`
for logfile in $FILES
do
    newlogfile=$logfile.$DATE
    CMD="cp $logfile $newlogfile"
    $CMD
    cat /dev/null > $logfile
    rotated_logs="$rotated_logs $newlogfile"
done
for newlogfile in $rotated_logs
do
    echo $newlogfile
    echo "look in /soa/p2/syst/var/log/XMLTLogs for backed up log files"
    echo "files gzipped. files deleted after one week"
    /usr/bin/gzip -9 $newlogfile
    mv ${newlogfile}.gz /soa/p2/syst/var/log/XMLTLogs
    find /soa/p2/syst/var/log/XMLTLogs -mtime +7 | xargs rm
done
# 5  
Old 01-22-2007
Apparently the directory "/soa/p2/syst/var/log" contains subdirectories.

In each of these subdirectories the file "XMLTTrace.log" can be present.

All files named "XMLTTrace.log" located in the directory "/soa/p2/syst/var/log" or its subdirectories are found using the find command.

Each file found is copied to a new name, with a timestamp appended to it, using the "cp" command.

Next, the original file is truncated using the "cat" command.

All files which are found are also stored in the variable "rotated_logs".

#!/bin/ksh
DATE=`date +%Y%m%d%H%M`
FILES=`find /soa/p2/syst/var/log -name XMLTTrace.log`
for logfile in $FILES
do
    newlogfile=$logfile.$DATE
    CMD="cp $logfile $newlogfile"
    $CMD
    cat /dev/null > $logfile
    rotated_logs="$rotated_logs $newlogfile"
done

All files stored in the variable "rotated_logs" are then compressed and moved to a different directory: "/soa/p2/syst/var/log/XMLTLogs".

This directory is also searched for files older than 7 days, and those are removed.

for newlogfile in $rotated_logs
do
    echo $newlogfile
    echo "look in /soa/p2/syst/var/log/XMLTLogs for backed up log files"
    echo "files gzipped. files deleted after one week"
    /usr/bin/gzip -9 $newlogfile
    mv ${newlogfile}.gz /soa/p2/syst/var/log/XMLTLogs
    find /soa/p2/syst/var/log/XMLTLogs -mtime +7 | xargs rm
done

All in all, this is a very inefficient script. Just look at the last "find" command, which is inside the "for ... do ... done" loop: the same deletion pass is performed once for every rotated file, while only the first pass can actually remove anything.


All of it could be done in a single loop:


#!/bin/ksh
DATE=`date +%Y%m%d%H%M`
find /soa/p2/syst/var/log -name XMLTTrace.log | \
while read FILE
do
    mv ${FILE} ${FILE}.${DATE}      # rotate: rename with a timestamp
    touch ${FILE}                   # recreate an empty log file
    gzip -9 ${FILE}.${DATE}
    mv ${FILE}.${DATE}.gz /soa/p2/syst/var/log/XMLTLogs
done
# purge archived logs older than a week -- once, outside the loop
find /soa/p2/syst/var/log/XMLTLogs -type f -mtime +7 -exec rm {} \;

This is still not the way I would do it, but it is a more efficient version of your script. The only "but" here is that the newly created log file could end up with different ownership/permissions from the original, because instead of being truncated it is newly created. This could be solved by placing a "chown" and a "chmod" command between the "touch" and "gzip" lines.
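A minimal sketch of that fix, wrapped in a function so the two directories become parameters (in the thread they would be /soa/p2/syst/var/log and /soa/p2/syst/var/log/XMLTLogs). Note the --reference options are GNU extensions; on Solaris you would name the owner and mode explicitly (e.g. "chown mqm:mqm", "chmod 640"):

```shell
#!/bin/ksh
# Sketch: same single-loop rotation, with the chown/chmod step added
# between "touch" and "gzip" to preserve the original file's attributes.
rotate_logs() {
    LOGBASE=$1      # directory tree holding the XMLTTrace.log files
    ARCHIVE=$2      # directory receiving the gzipped rotated copies
    DATE=$(date +%Y%m%d%H%M)

    find "$LOGBASE" -name XMLTTrace.log | while read FILE
    do
        mv "$FILE" "$FILE.$DATE"
        touch "$FILE"                              # new, empty log file
        chown --reference="$FILE.$DATE" "$FILE"    # restore owner/group
        chmod --reference="$FILE.$DATE" "$FILE"    # restore permissions
        gzip -9 "$FILE.$DATE"
        mv "$FILE.$DATE.gz" "$ARCHIVE"
    done

    # purge archived logs older than a week -- once, outside the loop
    find "$ARCHIVE" -type f -mtime +7 -exec rm {} \;
}
```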