Trimming files concurrently


 
# 1  
Old 12-22-2007

I have a file which is written by an ongoing process (actually it is a logfile). I want to trim this logfile with a script and save the trimmed portion to another file, or write it to <stdout>, where it gets picked up by another process.

Fortunately my logfile has a known format (XML) and I can find the end of a logical paragraph (a certain closing tag), so I know up to which line I have to trim. The problem I am facing is that the logfile is written more or less permanently (at a rate of ~10k lines per day) and I want to reduce the portion which might be lost during the trimming to the absolute minimum. I am well aware that I cannot achieve the optimum of losing no output at all without job control (which I don't have), but I want to get as close to it as possible. This is what I have come up with so far ("</af>" is the closing tag I am anchoring on):

Code:
                                                 # get nr of lines in log
iNumLines=$(wc -l $fLog | sed 's/^ *//' | cut -d' ' -f1)

chActLine=""
(( iNumLines += 1 ))
while [ "$chActLine" != "</af>" ] ; do           # scan backwards from the
     (( iNumLines -= 1 ))                        # end until the last
     chActLine="$(sed -n "${iNumLines}p" $fLog)" # closing tag is found
done

sed -n "1,${iNumLines}p" $fLog                   # output to <stdout>
sed "1,${iNumLines}d" $fLog > $chTmpDir/log      # remove printed lines
cat $chTmpDir/log > $fLog                        # overwrite with shortened
                                                 # version
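
A side note: the backward scan costs one sed process per inspected line. Given the same assumption that the closing tag stands alone on its line, the last "</af>" could also be located in a single pass, which shortens the overall run time and thus the exposure window - an untested sketch:

Code:
iNumLines=$(awk '$0 == "</af>" { iLast = NR } END { print iLast }' $fLog)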

While this is generally doing what I want, I'd like to ask if there might be a way to further reduce the risk of lost lines, which I perceive between lines 11 and 12 of the script snippet - strictly speaking, anything appended after the sed in line 12 has read the file but before the cat in line 13 overwrites it is lost.

Any suggestions will be welcome.

bakunin
# 2  
Old 12-23-2007
Hi, bakunin.

I don't know how fast your machine is, but you are dealing with several processes here, so those will definitely take up some time. If I understand this, we want to minimize real time to avoid loss of data. You didn't mention how large the file was. If it's really large, this might not work.

Memory is obviously faster than disk, so I suggest writing a perl script that slurps in the file, perhaps with several subroutines (if you like modularity) taking the place of the seds, and then writes out the results. Even if you copy the perl lists a few times internally, that's still a real-time saving over disk access.

Even if you did this and it turned out not to be the final answer, there might be some parts of the perl script that would be useful, and just doing the script might suggest other alternatives ... cheers, drl
# 3  
Old 12-25-2007
Quote:
Originally Posted by drl
Hi, bakunin.

I don't know how fast your machine is, but you are dealing with several processes here, so those will definitely take up some time. If I understand this, we want to minimize real time to avoid loss of data. You didn't mention how large the file was. If it's really large, this might not work.
The machine is an LPAR in an IBM p570 with 2 physical CPUs (4 LCPUs). The data volume is as follows:

~10k lines per day
~4MB per day

The file is the garbage collector log of a JVM (the machine is running some Websphere 6.1 application servers) and the logfile is in XML format. That means the lines are not written at constant intervals, but always a bunch of lines (one "paragraph", so to speak) at a time. The information units I want to separate each start with an "<af>" tag and end with an "</af>" tag.
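
Schematically, one such unit looks like this (purely illustrative - the real tags carry attributes, which I omit here):

Code:
<af>
    ... details of one garbage collection event ...
</af>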

Quote:
Memory is obviously faster than disk, so I suggest writing a perl script that slurps in the file, perhaps with several subroutines (if you like modularity) taking the place of the seds, and then writes out the results. Even if you copy the perl lists a few times internally, that's still a real-time saving over disk access.
Not at all! perl is definitely way slower than sed, by about a factor of 10. I came to this conclusion when working on my last project, where I had to change database dumps (frighteningly huge files) and replaced the perl programs doing it with sed - that sped up the process greatly.

As I see it, the critical part is only between lines 11 and 12 of the code snippet. All the previous operations work from line 1 up to some predetermined line x of the file, and it won't hurt if additional lines come in during that time.

As an additional requirement I have to preserve the inode of the file, because the process which writes to it (the garbage collector of the JVM) holds it open and will continue to write into it. This is why I used "cat > ..." instead of "mv ...".
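
For illustration, the difference is easy to see with ls -i (inode numbers are of course system-dependent):

Code:
ls -i $fLog                      # note the inode number
cat $chTmpDir/log > $fLog        # truncate and rewrite in place ...
ls -i $fLog                      # ... same inode: the writer's open
                                 # file descriptor stays valid
# whereas a mv would replace the directory entry:
mv $chTmpDir/log $fLog
ls -i $fLog                      # new inode - the JVM would keep
                                 # writing to the old, now orphaned one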

bakunin
# 4  
Old 12-26-2007
Interesting problem. I think it will be quite difficult to handle additional lines in such a fast-updating file using sed - after all, sed works through a snapshot of your file line by line, using its pattern & hold spaces. How can it keep track of new stuff coming in?

You will have to work with something which can seek to the point up to which you want to archive, and remove that part - DIRECTLY on the ever-changing logfile.

What you can do is reduce the window between these two:

Code:
sed -n "1,${iNumLines}p" $fLog                   # output to <stdout>
sed "1,${iNumLines}d" $fLog > $chTmpDir/log      # remove printed lines

And do it in one shot:

Code:
$ cat data
     1  file:10:no:1011
     2  file:10:file:1011
     3  data:10:say:1011
     4  data:10:data:1011
     5  file:10:file:1011
     6  file:10:file:1011
     7  file:10:file:1011
     8  file:10:file:1011
     9  file:10:file:1011
    10  file:10:file:1011
    11  file:10:file:1011
    12  file:10:file:1011
    13  file:10:file:1011
    14  file:10:file:1011
    15  file:10:file:1011
    16  file:10:file:1011
    17  file:10:file:1011
    18  file:10:file:1011
    19  file:10:file:1011
    20  file:10:file:1011
    21  data:10:say:1011
$ cat sedscr
#!/usr/bin/ksh

iNumLines=10
sed -n "
        # Get the lines to be archived and put on stdout
        1,$iNumLines p
        # Write the rest of (trimmed) data to temporary file. This file can be used to overwrite data.
        $((iNumLines+1)),\$ w data.trimmed
" data
$ sedscr
     1  file:10:no:1011
     2  file:10:file:1011
     3  data:10:say:1011
     4  data:10:data:1011
     5  file:10:file:1011
     6  file:10:file:1011
     7  file:10:file:1011
     8  file:10:file:1011
     9  file:10:file:1011
    10  file:10:file:1011
$ cat data.trimmed
    11  file:10:file:1011
    12  file:10:file:1011
    13  file:10:file:1011
    14  file:10:file:1011
    15  file:10:file:1011
    16  file:10:file:1011
    17  file:10:file:1011
    18  file:10:file:1011
    19  file:10:file:1011
    20  file:10:file:1011
    21  data:10:say:1011

I toyed with the idea of writing directly to data instead of data.trimmed, but that obviously doesn't help, since additional lines which come in after sed has read the file would be lost. Basically you can't use sed, ed etc., which operate on a "copy" of the file.
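
One way to shrink (though not close) that window is to compare line counts just before overwriting and append whatever arrived in the meantime - an untested sketch, assuming the logger strictly appends complete lines:

Code:
sed "1,${iNumLines}d" $fLog > $chTmpDir/log        # snapshot minus archived part
iSeen=$(( iNumLines + $(wc -l < $chTmpDir/log) ))  # lines that sed actually saw
iNow=$(wc -l < $fLog)                              # lines in the log right now
if [ $iNow -gt $iSeen ] ; then                     # catch up on late arrivals
     tail -n $(( iNow - iSeen )) $fLog >> $chTmpDir/log
fi
cat $chTmpDir/log > $fLog                          # overwrite, inode preserved

A tiny race remains between the tail and the cat, but it is much shorter than a full second pass over the file.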

HTH