Archiving or removing some data from a log file in real time
Hi,
I have a log file that gets updated every second, and its size has now grown past 20 GB. I need a command/script that will get the actual size of the file and remove 50% of the data in it. I don't mind removing the data, as the file has grown so large. Please advise, as this is a bit urgent because of disk space on the server.
It may be hard to execute any kind of cleanup with new data being added every second.
One theory for cleanup...
determine the line count, assuming each update is on its own line
divide that number in half
use a tail command to copy the second half of the file to a new file
then copy it back to the original filename
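The steps above can be sketched as a short script. Everything here is an assumption for illustration (the filename, and that each log entry occupies exactly one line), and any lines appended between the tail and the copy-back would be lost:

```shell
#!/bin/sh
# Hypothetical filename; a small sample stands in for the real 20 GB log
logfile="demo.log"
seq 1 10 > "$logfile"

total=$(wc -l < "$logfile")                   # line count
half=$((total / 2))                           # keep the newer half
tail -n "$half" "$logfile" > "$logfile.tmp"
cat "$logfile.tmp" > "$logfile"               # copy back over the same inode
rm -f "$logfile.tmp"
```

Copying back with cat > (rather than mv) keeps the original inode, which matters if a process still holds the file open.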
It seems cat will take a long time to read the file, as it is huge now. var=`expr $(cat filename | wc -l) / 2` is taking a long time to execute. I am still waiting, though.
Last edited by Franklin52; 06-27-2014 at 11:26 AM. Reason: Please use code tags
Deleting all the data would be easy, but half? Hmm.
What is making this logfile? Many daemons allow you to send them a signal when you want to change the logfile, which would at least let you deal with the file without the daemon stomping on it several times a second.
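For daemons that offer no such signal, logrotate's copytruncate option implements the same idea at the file level: it copies the log aside and truncates the original in place, so the writer's open descriptor stays valid. A hypothetical stanza (path and sizes invented):

```
/var/log/myapp.log {
    size 10G
    rotate 1
    copytruncate
    compress
}
```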
Surely cat filename|wc -l adds a process and therefore considerable extra time. Would wc -l filename not be quicker?
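For example (filename and sample contents assumed), redirecting the file into wc avoids both the extra cat process and the pipe:

```shell
#!/bin/sh
seq 1 100 > filename                 # stand-in sample file
# wc reads from a redirection: no cat process, no pipe
half=$(( $(wc -l < filename) / 2 ))
echo "$half"
```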
Anyhow, is the file held open and appended to all the time, or is each write a separate operation? Consider these two (probably not exactly true, just as an example)
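The example snippets themselves did not survive; a guess at the first form, five separate echo statements, each doing its own open-append-close (filename invented):

```shell
#!/bin/sh
# Five discrete open-append-close operations on the file
for i in 1 2 3 4 5; do
    echo "message $i" >> first.log
done
```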
versus
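And a guess at the second form, where a single redirection on the block keeps the file open for the whole run (filename invented):

```shell
#!/bin/sh
# One open: the redirection applies to the whole block
for i in 1 2 3 4 5; do
    echo "message $i"
done >> second.log
```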
In the first, you have five discrete "open-append and close" operations. In the second you have one, so in the gaps between the echo statements the file remains open. If you delete the data and write the file back, where does the subsequent output go? If you rename the file, then the output follows the old file.
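The rename behaviour is easy to demonstrate (filenames invented): the writer keeps its open descriptor, so output follows the renamed file, and a recreated file under the old name stays empty.

```shell
#!/bin/sh
# Background writer holds out.log open for the whole loop
( for i in 1 2 3 4 5; do echo "line $i"; sleep 0.2; done ) > out.log &
sleep 0.3
mv out.log moved.log     # the writer's descriptor still points at this inode
touch out.log            # new, unrelated file under the old name
wait                     # let the writer finish
wc -l moved.log out.log  # all five lines land in moved.log; out.log is empty
```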
Like Corona688 says, we need to know what is generating the messages. It may be that you have to stop that process whilst you manipulate the file, then restart it if there is no signal you can send to get it to switch logs.
Below is my script to log every command entered by any user to /var/log/messages, but I can't achieve the desired output. Please see below.
function log2syslog
{
declare COMMAND
# fc -ln -0 prints the most recently executed command, without line numbers
COMMAND=$(fc -ln -0)
# log it tagged "bash", with the PID (-i), at facility/priority local1.notice
logger -p local1.notice -t bash -i -- "$USER:$COMMAND"
}
trap... (12 Replies)
Hello All,
I am building a real time parser for a log file in my application.
The log file is continuously written at a very fast pace and gets rolled over every 10 minutes.
I have measured the speed and observed that around 1000 lines are written to it every second, each line about 30-40... (7 Replies)
Hi people
I have a bash script with a line like this:
python example.py >> log &
But I can't see anything in the log file while the Python program is running; the log file only seems to be written when the program ends.
Running "cat log", for example, doesn't show anything until the program ends.
Is there... (4 Replies)
Hey all, I have a problem I was hoping to get some help on. I have two audit files, audfile1 and audfile2, that can be written to, and I want the text version of them written to an NFS mount that I have set up. I already know that I can do .secure/etc/audsp audfile1 > //nfsmount/folder/... (5 Replies)
Hi all,
I would like to write a shell script that can monitor the access_log in real time.
When a line written to the access_log contains "abcdef", the program will copy that line into a file named "abcdef.txt"; it should do the same thing if the line contains "123456", copying it into a file named... (3 Replies)
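The real-time form of this is usually tail -F access_log piped into a filter. A sketch of the filter logic (output filenames from the post, sample input invented), run here over a static sample so that it terminates:

```shell
#!/bin/sh
# Real-time form would be:  tail -F access_log | awk '...'
printf 'GET /abcdef/page\nGET /other\nPOST /123456/form\n' |
awk '/abcdef/ { print >> "abcdef.txt" }
     /123456/ { print >> "123456.txt" }'
```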
When I run "/etc/myApp" I am presented with continuous output, just about once per second.
However when I try to get the information in Perl via a piped open, it waits till the end to give me anything... my code:
open (OUTPUT,"/etc/myApp |");
while (<OUTPUT>){
print $_;
}... (2 Replies)