Reading/Compressing of log file
Post 302345097 by chompy, 18 August 2009, 11:51 AM
Depending on the nature of these logfiles, and your craftiness, you might be able to use a tool like logwatch to do the parsing for you.

A 20-30 GB log file?! What happened to syslog?
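If logwatch can't make sense of the format, a plain streaming grep pass usually can, since grep reads line by line rather than loading the whole file into memory. A minimal sketch (the file name and error string are placeholders):

    # search the whole file in one streaming pass; LC_ALL=C avoids slow multibyte matching
    LC_ALL=C grep -n 'ERROR' /var/log/huge.log > /tmp/errors_found.txt

    # or break it into 1 GB pieces first if you prefer to work on smaller chunks (GNU split)
    split -b 1G /var/log/huge.log /tmp/huge_chunk_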
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Log File Writing and Reading

Hi all, I have the following shell script code which tries to sftp and write its output to a log file. TestConnection () { echo 'Connection to ' $DESTUSERNAME@$DESTHOSTNAME $SETDEBUG if ]; then rm $SCRIPT ; fi touch $SCRIPT echo "cd" $REMOTEDIR >> $SCRIPT echo "quit" >>... (10 Replies)
Discussion started by: valluvan
10 Replies
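A minimal sketch of the same idea, keeping the batch-file approach from the post and sending sftp's own output to a log; the paths are hypothetical and the variables are assumed to be set elsewhere in the script:

    SCRIPT=/tmp/sftp_batch.$$          # hypothetical temporary batch file
    LOGFILE=/tmp/sftp_transfer.log     # hypothetical log file
    echo "cd $REMOTEDIR"  > "$SCRIPT"
    echo "quit"          >> "$SCRIPT"
    sftp -b "$SCRIPT" "$DESTUSERNAME@$DESTHOSTNAME" >> "$LOGFILE" 2>&1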

2. UNIX for Dummies Questions & Answers

Compressing of log files

Hello all, my first post in the forum. :) I have huge log files of 20-30 GB on my Unix server. I want to analyse them for some error messages, but because of their size I am not able to grep/search for the pattern in the file. I also tried to gzip the... (1 Reply)
Discussion started by: sgbhat
1 Reply
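One way to sidestep the size problem is to compress once with gzip (which streams and needs little memory) and then search the compressed copy with zgrep; the file name and pattern are illustrative:

    gzip -v server.log                       # leaves server.log.gz and frees the original space
    zgrep -n 'error' server.log.gz > /tmp/errors.txt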

3. Shell Programming and Scripting

Searching for Log / Bad file and Reading and writing to a flat file

Need to develop a Unix shell script for the below requirement and I need your assistance: 1) search for the file.log and file.bad files in a directory and read them 2) pull out "Load_Start_Time", "Data_File_Name", "Error_Type" from the log file 4) concatenate each row from the bad file as... (3 Replies)
Discussion started by: mlpathir
3 Replies
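A rough sketch of steps 1), 2) and 4), assuming the fields appear as literal labels in the .log file and that the rows of the .bad file should be joined with commas; the directory and output names are placeholders:

    DIR=/path/to/files                       # hypothetical directory
    for logf in "$DIR"/*.log; do
        grep -E 'Load_Start_Time|Data_File_Name|Error_Type' "$logf"
    done
    # join all rows of the .bad file into a single comma-separated line
    paste -s -d',' "$DIR"/file.bad > "$DIR"/bad_rows.txt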

4. Shell Programming and Scripting

Is there any way to find the compressed size of a file without compressing it in linux

I need to back up a directory from one partition to another and compress that directory after backing it up, so I need to predict the compressed size of the directory without actually compressing it, to check whether the space is available in the destination partition to accommodate the zipped... (2 Replies)
Discussion started by: Kesavan
2 Replies
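There is no way to know the compressed size without actually running the compressor, but the result can be streamed into wc -c so nothing is written to the destination; a sketch with a placeholder path:

    # byte count the .tar.gz would occupy, without creating it on disk
    tar -cf - /path/to/dir | gzip -c | wc -c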

5. Shell Programming and Scripting

Compressing previous log automatically

I want to create a script that will zip the previous log. Example: abc.log.2012.12.02 abc.log.2012.12.01.gzip abc.log If today is 2012.12.03, my current log is abc.log and the previous date is 2012.12.02; I want abc.log.2012.12.02 to be compressed every time I run the script. I can... (5 Replies)
Discussion started by: kaibiganmi
5 Replies
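A small sketch that derives yesterday's date and gzips the matching file, assuming GNU date and the abc.log.YYYY.MM.DD naming from the post:

    prev=$(date -d yesterday +%Y.%m.%d)
    [ -f "abc.log.$prev" ] && gzip "abc.log.$prev"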

6. Shell Programming and Scripting

Changing the file name while compressing

Hi, is there any way to change the file name while compressing, using compress, gzip, or tar? Say I have a file foo.txt; I have to compress it so that the resulting file name is foo.txt_20130113.gz or foo.txt_20130113.Z. This is to be done while performing the compression... (2 Replies)
Discussion started by: karumudi7
2 Replies
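Both gzip and compress can write to stdout, so the output name can be chosen in the same step; a sketch with today's date appended:

    stamp=$(date +%Y%m%d)
    gzip -c foo.txt     > "foo.txt_${stamp}.gz"    # gzip variant
    compress -c foo.txt > "foo.txt_${stamp}.Z"     # compress variant, where compress(1) exists
    # remove foo.txt afterwards if the original is no longer needed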

7. Shell Programming and Scripting

Compressing old files as zip file through script

I have the below files in folder one/archive> one. txt 6/21/2013 two txt 7/23/2013 three.txt 6/20/2013 I want to move all the old files (>30 days) into a single .zip file in one/archive/ as below: two txt 7/23/2013 oldfiles.zip 6/21/2013 Please provide... (6 Replies)
Discussion started by: Ganesh L
6 Replies
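Assuming GNU find and Info-ZIP's zip, files older than 30 days can be fed to zip on stdin and removed as they are archived; a sketch:

    cd one/archive || exit 1
    find . -maxdepth 1 -type f -mtime +30 ! -name 'oldfiles.zip' -print |
        zip -m oldfiles.zip -@         # -m deletes each file after adding it to the archive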

8. Shell Programming and Scripting

Reading line by line from live log file using while loop and considering only those lines start from

Hi, I want to read a live log file line by line, considering only those lines that start with a timestamp. Below is the code I am using; it reads lines but throws an exception when comparing a line that does not contain the error code: tail -F /logs/COMMON-ERROR.log | while read myline; do... (2 Replies)
Discussion started by: ketanraut
2 Replies
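One way to avoid the exception is to skip lines that do not begin with a timestamp before doing any further comparison; a sketch assuming timestamps start with YYYY-MM-DD (adjust the pattern to the real format):

    tail -F /logs/COMMON-ERROR.log | while read -r myline; do
        case "$myline" in
            [0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]*)
                echo "processing: $myline"   # placeholder for the real handling
                ;;
            *)  : ;;                         # not a timestamped line, ignore it
        esac
    done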

9. Shell Programming and Scripting

Reading a log file

Hi, I'm trying to write a script to go through a few folders, read some log files, and make a list from the data. The log files consist of several steps, each step having a final energy, but I need to read the last final energy coming after the keyword "hurray". I have so far accomplished code like... (16 Replies)
Discussion started by: raymondg
16 Replies
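A hedged awk sketch: remember once the keyword "hurray" has been seen, keep overwriting the value on each later "final energy" line, and print the last one at the end; it assumes the energy is the last field on that line and the log name is a placeholder:

    awk '/hurray/                 { seen = 1 }
         seen && /final energy/   { val = $NF }
         END                      { if (val != "") print val }' step.log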

10. Homework & Coursework Questions

Trouble with Shell Script Compressing file

Use and complete the template provided. The entire template must be completed. If you don't, your post may be deleted! 1. The problem statement, all variables and given/known data: You will create a shell script that performs the following action: Use a parameter to pass a file that... (5 Replies)
Discussion started by: Luvs2drnk
5 Replies
TRIMHISTORY(8)                System Manager's Manual                TRIMHISTORY(8)

NAME
       trimhistory - remove old Xymon history-log entries

SYNOPSIS
       trimhistory --cutoff=TIME [options]

DESCRIPTION
       The trimhistory tool is used to purge old entries from the Xymon history
       logs. These logfiles accumulate information about all status changes that
       have occurred for any given service, host, or the entire Xymon system, and
       are used to generate the event- and history-log web pages.

       Purging old entries can be done while Xymon is running, since the tool
       takes care not to commit updates to a file if it changes mid-way through
       the operation. In that case, the update is aborted and the existing logfile
       is left untouched.

       Optionally, this tool will also remove logfiles from hosts that are no
       longer defined in the Xymon bb-hosts(5) file. As an extension, even
       logfiles from services can be removed, if the service no longer has a
       valid status report logged in the current Xymon status.

OPTIONS
       --cutoff=TIME
              Defines the cutoff time when processing the history logs. Entries
              dated before this time are discarded. TIME is specified as the
              number of seconds since the beginning of the Epoch. This is easily
              generated by the GNU date(1) utility, e.g. the following command
              will trim the history logs of all entries prior to Oct 1st 2004:
              trimhistory --cutoff=`date +%s --date="1 Oct 2004"`

       --outdir=DIRECTORY
              Normally, files in the BBHIST directory are replaced. This option
              causes trimhistory to save the shortened history logfiles to
              another directory, so you can verify that the operation works as
              intended. The output directory must exist.

       --drop
              Causes trimhistory to delete files from hosts that are not listed
              in the bb-hosts(5) file.

       --dropsvcs
              Causes trimhistory to delete files from services that are not
              currently tracked by Xymon. Normally these files would be left
              untouched if only the host exists.

       --droplogs
              Also process the BBHISTLOGS directory, and delete status logs from
              events prior to the cutoff time. Note that this can dramatically
              increase the processing time, since there are often many files to
              process.

       --progress[=N]
              Output a status line for every N history logs or status-log
              collections processed, to indicate how far the run has progressed.
              The default setting for N is 100.

       --env=FILENAME
              Load the environment from FILENAME before executing trimhistory.

       --debug
              Enable debugging output.

FILES
       $BBHIST/allevents         The eventlog of all events that have happened in Xymon.
       $BBHIST/HOSTNAME          The per-host eventlogs.
       $BBHIST/HOSTNAME.SERVICE  The per-service eventlogs.
       $BBHISTLOGS/*/*           The historical status logs.

ENVIRONMENT VARIABLES
       BBHIST      The directory holding all history logs.
       BBHISTLOGS  The top-level directory for the historical status-log collections.
       BBHOSTS     The location of the bb-hosts file, holding the list of
                   currently known hosts in Xymon.

SEE ALSO
       xymon(7), bb-hosts(5)

Xymon Version 4.2.3                   4 Feb 2009                     TRIMHISTORY(8)
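A further example combining the options documented above; the cutoff and directory are illustrative. This trims everything older than six months but writes the shortened logs to a scratch directory so they can be inspected before replacing the originals:

    trimhistory --cutoff=`date +%s --date="6 months ago"` \
                --outdir=/tmp/xymon-trimmed --progress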