10-08-2012
Compressing previous log automatically
I want to create a script that will gzip the previous day's log.
Example.
abc.log.2012.12.02
abc.log.2012.12.01.gzip
abc.log
If today is 2012.12.03, my current log is abc.log and the previous date is 2012.12.02. I want abc.log.2012.12.02 to be compressed every time I run the script.
I can compress the file manually with
gzip abc.log.2012.12.02 but I want it to run automatically at 2 AM every morning.
Every midnight the file rotates and a previous-date file is created, and I want that previous-date file to be compressed.
I know how to use a cron job, but I don't have a script to do it. Please help.
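A minimal sketch of such a script, assuming GNU date and that the rotated logs live in /var/log/myapp (the directory and the abc.log name are placeholders; adjust them to your setup):

```shell
#!/bin/sh
# compress_prev_log.sh - gzip yesterday's rotated log file.
# Assumes GNU date for "-d yesterday"; see note below for other systems.

LOGDIR=${LOGDIR:-/var/log/myapp}        # assumed log directory
PREV=$(date -d yesterday +%Y.%m.%d)     # e.g. 2012.12.02
FILE="$LOGDIR/abc.log.$PREV"

# Compress only if yesterday's file exists and was not already compressed.
if [ -f "$FILE" ] && [ ! -f "$FILE.gz" ]; then
    gzip "$FILE"                        # produces abc.log.2012.12.02.gz
fi
```

Then schedule it from cron to run at 2 AM daily with an entry such as: 0 2 * * * /path/to/compress_prev_log.sh. On systems without GNU date (e.g. Solaris), "date -d yesterday" is not available; a common, if imperfect, workaround is TZ=GMT+24 date +%Y.%m.%d.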
SAVELOG(8) System Manager's Manual SAVELOG(8)
NAME
savelog - save a log file
SYNOPSIS
savelog [-m mode] [-u user] [-g group] [-t] [-p] [-c cycle] [-l] [-j] [-J] [-1 .. -9] [-C] [-d] [-l] [-r rolldir] [-n] [-q] [-D dateformat]
file ...
DESCRIPTION
The savelog command saves and optionally compresses old copies of files. Older versions of file are named:
file.<number><compress_suffix>
where <number> is the version number, 0 being the newest. Version numbers > 0 are compressed unless -l prevents it. Version number 0 is
not compressed because a process might still have file opened for I/O. Only cycle versions of the file are kept.
If the file does not exist and -t was given, it will be created.
For files that do exist and have lengths greater than zero, the following actions are performed:
1) Version numbered files are cycled. Version file.2 is moved to version file.3, version file.1 is moved to version file.2, and so on.
Finally version file.0 is moved to version file.1, and version file is deleted. Both compressed names and uncompressed names are
cycled, regardless of -l. Missing version files are ignored.
2) The new file.1 is compressed unless the -l flag was given. It is changed subject to the -m, -u, and -g flags.
3) The main file is moved to file.0.
4) If the -m, -u, -g, -t, or -p flags are given, then an empty file is created subject to the given flags. With the -p flag, the file
is created with the same owner, group, and permissions as before.
5) The new file.0 is changed subject to the -m, -u, and -g flags.
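The cycling in steps 1 through 3 above can be sketched in plain shell. This is an illustrative approximation of the behaviour, not the real savelog implementation; it assumes a cycle count of 3 and a file named app.log:

```shell
#!/bin/sh
# Sketch of savelog's version cycling (steps 1-3), for a cycle of 3.
f=app.log
cycle=3

# Step 1: cycle numbered versions, oldest first, for both
# compressed and uncompressed names (file.1 -> file.2, file.0 -> file.1).
i=$((cycle - 1))
while [ "$i" -ge 1 ]; do
    prev=$((i - 1))
    if [ -f "$f.$prev.gz" ]; then mv "$f.$prev.gz" "$f.$i.gz"; fi
    if [ -f "$f.$prev" ]; then mv "$f.$prev" "$f.$i"; fi
    i=$prev
done

# Step 2: compress the new file.1 (version 0 stays uncompressed).
if [ -f "$f.1" ]; then gzip "$f.1"; fi

# Step 3: the main file becomes the new uncompressed file.0.
if [ -f "$f" ]; then mv "$f" "$f.0"; fi
true
```

Because the move onto file.<cycle-1> overwrites any existing file of that name, only cycle versions are ever kept, matching the description above.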
OPTIONS
-m mode
chmod the log files to mode, implies -t
-u user
chown log files to user, implies -t
-g group
chgrp log files to group, implies -t
-c cycle
Save cycle versions of the logfile (default: 7). The cycle count must be at least 2.
-t touch new logfile into existence
-l don't compress any log files (default: do compress)
-p preserve owner, group, and permissions of logfile
-j compress with bzip2 instead of gzip
-J compress with xz instead of gzip
For xz no strength option is set, and xz decides on the default based on the total amount of physical RAM. Note that xz can use a
very large amount of memory for the higher compression levels.
-1 .. -9
compression strength or memory usage (default: 9, except for xz)
-C force cleanup of cycled logfiles
-d use standard date for rolling
-D dateformat
override date format, in the syntax understood by the date(1) command
-r use rolldir instead of . to roll files
-n do not rotate empty files
-q be quiet
BUGS
If a process is still writing to file.0, and savelog moves it to file.1 and compresses it, data could be lost.
SEE ALSO
logrotate(8)
Debian 30 Dec 2017 SAVELOG(8)