Advice on monitoring gzipped files


# 1  
Old 05-24-2013

So, every 5 minutes I monitor some data files based on their line counts.

If I check now and the file has 100 lines, I scan those 100 lines for specific errors.

If I check again 5 minutes later and the file has 150 lines, I scan only lines 101 through 150, since only 50 lines were added since my last check.

Now, let's say the file being monitored is called dataccess.log.

When this file gets rotated, it is renamed to dataccess.log.gz,

and a new file called dataccess.log is created.

I know there are commands such as zgrep and zcat on Linux, but these commands don't exist on other UNIX OSes, e.g. Solaris.

So my question is: how would you guys scan the gzipped version of the file to make sure you didn't miss anything, and then scan the new dataccess.log as usual? Keep in mind I already have the line counts on record, so maybe that could be of use to you?
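For reference, the incremental check described above can be sketched roughly like this (the paths, the state file, and the FAIL pattern are all assumptions for the example, not part of any real setup):

```shell
#!/bin/sh
# Incremental scan: remember how many lines were checked last time,
# and only grep the lines added since then.
LOG=/tmp/dataccess.log          # assumed log path
STATE=/tmp/dataccess.offset     # stores the last line count seen

# demo setup: a small log and a zeroed offset
printf 'ok\nFAIL one\nok\n' > "$LOG"
echo 0 > "$STATE"

last=$(cat "$STATE")
now=$(wc -l < "$LOG")
if [ "$now" -gt "$last" ]; then
    # emit only lines last+1 .. now, then look for errors
    awk -v n="$last" 'NR > n' "$LOG" | grep FAIL
fi
echo "$now" > "$STATE"
```

On the next run, `$STATE` holds the previous line count, so only newly appended lines are scanned.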
# 2  
Old 05-24-2013
If you look inside gzcat on many systems, you'll find that it's a shell script containing this:

Code:
gunzip -cd "$@"

-c for "print to stdout", -d for "decompress".

so

Code:
( gunzip -cd /path/to/file.log.gz ; cat /path/to/logfile ) | grep ...
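As a quick demonstration of the idea (throwaway paths made up for the example): the subshell decompresses the rotated file and concatenates the current one into a single stream, so grep sees both files in order and nothing uncompressed ever touches the disk.

```shell
# Build a rotated (gzipped) log and a current log, then scan both.
printf 'FAIL old\nok\n' > /tmp/demo.log
gzip -c /tmp/demo.log > /tmp/demo.log.gz
printf 'ok\nFAIL new\n' > /tmp/demo.log   # the "new" file after rotation

( gunzip -cd /tmp/demo.log.gz ; cat /tmp/demo.log ) | grep FAIL
```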

# 3  
Old 05-24-2013
Quote:
Originally Posted by Corona688
If you look inside gzcat on many systems, you'll find that it's a shell script containing this:

Code:
gunzip -cd "$@"

-c for "print to stdout", -d for "decompress".

so

Code:
( gunzip -cd /path/to/file.log.gz ; cat /path/to/logfile ) | grep ...

Thank you so much! I had no idea it could be done this way.

The only problem I have is that I don't know how big the gzipped log file could be, so decompressing and scanning the entire thing could be disastrous if it's huge.

The only alternative is to write the uncompressed output to disk, but that's a bad idea.


If there's absolutely no other solution, would the following be the most efficient way to go about what I'm trying to do?

Code:
( gunzip -cd /path/to/file.log.gz | awk 'NR>100' ; cat /path/to/logfile ) | grep FAIL

Also, I worry that the command could overwrite the new copy (dataccess.log) that was created when the data file was rotated.
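A rough sketch of how that could play out, with throwaway files in place of the real paths (names, line counts, and the FAIL pattern are all assumptions): the already-scanned lines live in the rotated .gz, so the NR filter is applied to the decompressed stream, then the whole new file is appended. Everything flows through the pipe, and grep only reads its input, so it cannot overwrite dataccess.log.

```shell
# Demo setup with throwaway files (real paths would differ).
printf 'ok\nFAIL seen\nFAIL missed\n' > /tmp/rot.log
gzip -f /tmp/rot.log                  # rotation: -> /tmp/rot.log.gz
printf 'FAIL new\n' > /tmp/rot.log    # fresh file after rotation
LAST=2   # line count recorded before rotation

# Skip the already-scanned lines inside the gz, then take the
# whole new file; grep reads the stream and writes nothing.
( gunzip -cd /tmp/rot.log.gz | awk -v n="$LAST" 'NR > n'
  cat /tmp/rot.log ) | grep FAIL
```

Only "FAIL missed" and "FAIL new" come out; the previously scanned "FAIL seen" line is skipped.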