Help to improve speed of text processing script


 
# 1  
Old 07-27-2009
Quote:
You know the movie matrix?
Since you have reminded me of the movie 'Matrix', I am all charged up to answer your question, to an extent at least.

Code:
${#lines[@]}

If the array is not modified inside the loop, assign its length to a variable once and reuse that instead of computing it on every iteration, as in the sketch below.
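
For example, a minimal sketch of that idea, using the lines array name from your snippet (the sample data here is only a placeholder):

Code:
#!/bin/bash
lines=("alpha" "beta" "gamma")      # placeholder data; the real script fills this array
lines_count=${#lines[@]}            # compute the length once, outside the loop
i=0
while (( i < lines_count )); do     # reuse the cached value instead of ${#lines[@]} each time
    printf '%s\n' "${lines[$i]}"
    (( i++ ))
done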

Code:
echo "${lines[${i}]}" >> test_$count.txt
done
echo "" >> test_$count.txt

Writing to a file from inside the loop body will greatly reduce the performance of the script. For every write call, what happens is:
Code:
open file
write data
close file

Ideally what should be done is

Code:
open file
write data
write data
.
.
.
close file

Instead, redirect the output at the outer block, something like:
Code:
while [ condition ]
do
# check and do some processing
done > $output_file

With this method, there will be approximately n + 2 calls to the file routines for 'n' units of write (one open, n writes, one close),

instead of

Code:
while [ condition ]
do
# check and do some processing
# write to a file  > $output_file
done

With this method, there will be 3n calls for 'n' units of write (an open, a write, and a close on every iteration), which scales badly as 'n' grows.
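
As a self-contained illustration of the first (redirected) form, the loop below performs n writes but only one open and one close (the file name and the data written are just placeholders):

Code:
#!/bin/bash
output_file="blocks.txt"            # placeholder name
i=0
while (( i < 1000 )); do
    printf 'line %d\n' "$i"         # each iteration only writes to stdout ...
    (( i++ ))
done > "$output_file"               # ... and stdout is opened once for the whole loop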
# 2  
Old 07-27-2009
lorus,

Why are we trying to write a split script?! Let the split command do the job:

Code:
split -db 1m InFile OutFile

Output: it creates files like:
OutFile00
OutFile01
...
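
For readability, here is the same command spelled with long options, plus a quick size check (this assumes GNU split; the file names are the ones from the example above):

Code:
split --numeric-suffixes --bytes=1M InFile OutFile
ls -lh OutFile*    # every piece except the last should be about 1 MB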
# 3  
Old 07-27-2009
Have you considered csplit? Assuming the average size of your block is 300 bytes, 1024000/300 ≈ 3413 blocks per 1MB:
Code:
csplit -k myinputfilename  '/^#Game/-1{3413}'
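
If the count needs recomputing for a different average block size, the arithmetic can be done in the shell first (the 300-byte figure and the #Game pattern are just the ones from this example; here the repeat count is passed to csplit as a separate operand):

Code:
avg_block=300                          # assumed average block size in bytes
splits=$(( 1024000 / avg_block ))      # = 3413, as in the estimate above
csplit -k myinputfilename "/^#Game/-1" "{$splits}"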

# 4  
Old 07-27-2009
Quote:
Originally Posted by jim mcnamara
Have you considered csplit? Assuming the average size of your block is 300 bytes, 1024000/300 ≈ 3413 blocks per 1MB:
Code:
csplit -k myinputfilename  '/^#Game/-1{3413}'

I forgot to say that the length of each block is different.

The posted one is just an example.
# 5  
Old 07-27-2009
Sorry, I could not dig much into your script.

So I wrote a quick one for your case, where the splitting into blocks happens on the fly, scanning each line without having to store any data or predetermine anything.

I hope this improves performance. If you have some time, please post some stats.

Code:
#! /opt/third-party/bin/perl

use strict;
use warnings;

my ($input_file) = @ARGV;
open(my $lfh, '<', $input_file) or die "Unable to open file:$input_file <$!>\n";

my $start = 0;
my $curr_file_number = 1;
my $rfh;
my $data;
while ( $data = <$lfh> ) {
    print $rfh $data if ( $start == 1 );

    if ( $data =~ /#Game No :/ ) {
        my $running_file_name = "tyrant_" . $curr_file_number;
        open($rfh, '>', $running_file_name) or die "Unable to open file : $running_file_name <$!>\n";
        $start = 1;
        print $rfh $data;
        next;
    }
    if ( $data =~ / wins / ) {
        close($rfh) or die "Unable to close file <$!>\n";
        $start = 0;
        $curr_file_number++;
    }
}

close($lfh);

# 6  
Old 07-27-2009
Quote:
Originally Posted by matrixmadhan
Sorry, I could not dig much into your script.

So I wrote a quick one for your case, where the splitting into blocks happens on the fly, scanning each line without having to store any data or predetermine anything.

I hope this improves performance. If you have some time, please post some stats.

Code:
#! /opt/third-party/bin/perl

use strict;
use warnings;

my ($input_file) = @ARGV;
open(my $lfh, '<', $input_file) or die "Unable to open file:$input_file <$!>\n";

my $start = 0;
my $curr_file_number = 1;
my $rfh;
my $data;
while ( $data = <$lfh> ) {
    print $rfh $data if ( $start == 1 );

    if ( $data =~ /#Game No :/ ) {
        my $running_file_name = "tyrant_" . $curr_file_number;
        open($rfh, '>', $running_file_name) or die "Unable to open file : $running_file_name <$!>\n";
        $start = 1;
        print $rfh $data;
        next;
    }
    if ( $data =~ / wins / ) {
        close($rfh) or die "Unable to close file <$!>\n";
        $start = 0;
        $curr_file_number++;
    }
}

close($lfh);

Ah, a perl script. I tried to avoid learning perl, but if it's significantly faster than the bash tools, maybe there is no way around it?

Code:
# ./psplit.sh input.txt
Unable to close file <Bad file descriptor>

But it generates some files:

Code:
# ls -l
total 24140
-rwxrwxrwx 1 root root 24608349 2009-07-26 16:30 input.txt
drwsrwsrwt 2 root root      101 2009-07-27 11:20 output
-rwxr-xr-x 1 root root      708 2009-07-27 13:29 psplit.sh
-rwxrwxrwx 1 root root      596 2009-07-27 11:11 split.sh
-rwxr-xr-x 1 root root      265 2009-07-27 11:34 split_test.sh
-rw-r--r-- 1 root root      683 2009-07-27 13:32 tyrant_1
-rw-r--r-- 1 root root      825 2009-07-27 13:32 tyrant_10
-rw-r--r-- 1 root root      609 2009-07-27 13:32 tyrant_11
-rw-r--r-- 1 root root      605 2009-07-27 13:32 tyrant_12
-rw-r--r-- 1 root root      777 2009-07-27 13:32 tyrant_13
-rw-r--r-- 1 root root     1001 2009-07-27 13:32 tyrant_14
-rw-r--r-- 1 root root      695 2009-07-27 13:32 tyrant_15
-rw-r--r-- 1 root root      747 2009-07-27 13:32 tyrant_16
-rw-r--r-- 1 root root      848 2009-07-27 13:32 tyrant_17
-rw-r--r-- 1 root root      631 2009-07-27 13:32 tyrant_18
-rw-r--r-- 1 root root      664 2009-07-27 13:32 tyrant_19
-rw-r--r-- 1 root root      767 2009-07-27 13:32 tyrant_2
-rw-r--r-- 1 root root      804 2009-07-27 13:32 tyrant_20
-rw-r--r-- 1 root root      655 2009-07-27 13:32 tyrant_21
-rw-r--r-- 1 root root      784 2009-07-27 13:32 tyrant_22
-rw-r--r-- 1 root root      628 2009-07-27 13:32 tyrant_23
-rw-r--r-- 1 root root     1040 2009-07-27 13:32 tyrant_24
-rw-r--r-- 1 root root      813 2009-07-27 13:32 tyrant_3
-rw-r--r-- 1 root root      810 2009-07-27 13:32 tyrant_4
-rw-r--r-- 1 root root      679 2009-07-27 13:32 tyrant_5
-rw-r--r-- 1 root root     1078 2009-07-27 13:32 tyrant_6
-rw-r--r-- 1 root root      949 2009-07-27 13:32 tyrant_7
-rw-r--r-- 1 root root      962 2009-07-27 13:32 tyrant_8
-rw-r--r-- 1 root root      810 2009-07-27 13:32 tyrant_9

Each file has one block inside. What I wanted was to split the input file into 1MB files, each containing as many blocks as fit into 1MB.
# 7  
Old 07-27-2009
Try this and play around with the number (1000000) to get the desired size:

Code:
awk 'BEGIN{c=1}
/Hand History/{f=1;if(size>1000000){close("output_" c);size=0;c++}}
/Game #/{print "" > "output_" c;f=0}
f{print > "output_" c; size+=length}'  file
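
If you would rather stay in bash, here is a rough sketch of the same size-rotation idea (the #Game marker, the chunk_ file names and the 1000000-byte limit are assumptions based on the earlier posts; the awk above will still be considerably faster):

Code:
#!/bin/bash
# Sketch only: start a new output file once roughly 1 MB has been written,
# but only at a block boundary so blocks are never cut in half.
infile=$1
count=1
size=0
limit=1000000

exec > "chunk_$count.txt"            # open the first chunk once
while IFS= read -r line; do
    if [[ $line == "#Game"* && $size -gt $limit ]]; then
        count=$((count + 1))
        size=0
        exec > "chunk_$count.txt"    # rotate to the next chunk at a block boundary
    fi
    printf '%s\n' "$line"
    size=$((size + ${#line} + 1))    # +1 for the newline
done < "$infile"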

Regards

Last edited by Franklin52; 07-27-2009 at 10:02 AM. Reason: adding close() command