How can I speed this script up?


 
# 1  
Old 03-22-2012
How can I speed this script up?

Hi,

I'm quite new to scripting and would like a bit of assistance with speeding up the following script. At the moment it is quite slow.

Any way to improve it?


Code:
total=111120
while [ $total -lt 111130 ]
do
total=`expr $total + 1`
INCREMENT=$total
firstline="blablabla"
secondline="blablabla $INCREMENT"
Thirdline="blablabla"
fourthline="blablabla"
fifthline="blablabla"
sixthline="blablabla"
seventhline="blablabla"

echo $firstline >> newfile.xml
echo $secondline >> newfile.xml
echo $Thirdline >> newfile.xml
echo $fourthline >> newfile.xml
echo $fifthline >> newfile.xml
echo $sixthline >> newfile.xml
echo $seventhline >> newfile.xml

# database query which uses file

echo file done

done

Thanks!
# 2  
Old 03-22-2012
Please clarify your final goal: what are you trying to do?

Did you consider
Code:
cat -n yourfile

?

There are several posts around that talk about XML file generation and/or parsing.
You can use the Search tool from the menu to look for those posts.

---------- Post updated at 06:57 PM ---------- Previous update was at 06:38 PM ----------

Code:
INCREMENT=111120
while [ $INCREMENT -lt 111130 ]
do
let INCREMENT+=1

echo "blablabla
blablabla $INCREMENT
blablabla
blablabla
blablabla
blablabla
blablabla" >> newfile.xml

# ...

echo "file $INCREMENT done"

done

?
# 3  
Old 03-22-2012
Quote:
Originally Posted by brunlea
Hi,

I'm quite new to scripting and would like a bit of assistance with speeding up the following script. At the moment it is quite slow.
Do you realize you're never clearing that XML file? It's just growing and growing every loop!

Anyway, you can put all that text in one here-document instead of seven separate echo calls per loop iteration, each reopening the same file:

Code:
cat <<EOF >file.xml
line1
line2 $variable
line3
EOF


Instead of running the database query once per loop iteration, you could probably run it a single time to handle all of the queries, tying the loop output directly into it with a pipe.


Code:
total=111120
while [ $total -lt 111130 ]
do
        total=`expr $total + 1`

# Note the ending EOF below the last 'blablabla' MUST be at the very
# beginning of the line or the here-document will eat the entire rest of the
# script!
        cat <<EOF
blablabla
blablabla $total
blablabla
blablabla
blablabla
blablabla
blablabla
EOF

        # Print this to stderr so it doesn't end up in your query or what have you
        echo file done >&2
done | database_query_using_stream

What is this 'database query which uses file'? I suspect that's going to be the limiting step here and we can't tell you how to rewrite it without seeing it.
# 4  
Old 03-22-2012
Please post what Operating System and version you have and what Shell you are using.
Please quantify the performance problem with real numbers, and mention the hardware specification and any other contention for resources, to give a feeling for scale. There's a world of difference between performance problems on a home PC and those on a 10,000-user system.
# 5  
Old 03-23-2012
Using AIX 5.2. Scripting in #!/bin/sh.

OK, just to clarify what I am trying to do: I am trying to make the creation of the file as efficient as possible, or there may be another way of achieving my objective without creating a file at all. Explained below.

The first variable written to the file is written like this, so the file starts fresh each time:

Code:
echo $firstline > newfile.xml

The $firstline and $secondline variables are actually XML tags with information in them.
$thirdline contains an Insert statement for the database:
<Insertquery>Insert into newtable</Insertquery>
$fourthline contains the values to enter into the table:
<values>'one', '2012-03-23 $INCREMENT', 'message'</values>

I am trying to update a database table which can only be updated through an XML file containing the various tags in the correct format.

newfile.xml is the file used as the template which updates the database. However, the second value to be entered into the database needs to be unique, which is why I have added $INCREMENT to the values.

A Java call is made and it uses the XML file to insert the values into the database. I need to insert 1000 records; however, I think the creation of the file is making it slow at the moment, as it is taking 1 second per transaction.
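
So using the here-document suggestion, one record's block in the file would look something like this (the two header tags here are just placeholders, since I have not posted the real ones):

Code:
INCREMENT=111121

# append one record's block to the XML template file
cat <<EOF >> newfile.xml
<header>blablabla</header>
<header2>blablabla $INCREMENT</header2>
<Insertquery>Insert into newtable</Insertquery>
<values>'one', '2012-03-23 $INCREMENT', 'message'</values>
EOF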
# 6  
Old 03-23-2012
There is no need to create a file. I would use a here document streamed directly, like Corona suggested. Did you try his suggestion? If you need to use variables for some reason, you can also use them inside the here document. If they need to be separate queries, then you can also try replacing the cat statement with your database query statement and not use a pipe:

Code:
database query << EOF
....
EOF

The second EOF needs to be at the beginning of the line; you cannot use indentation.
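
For example, a rough sketch of that per-record approach; run_query and the header tags below are placeholders for whatever command actually performs the database call and for your real XML tags:

Code:
INCREMENT=111120
while [ $INCREMENT -lt 111130 ]
do
        # shell arithmetic avoids forking expr on every iteration
        INCREMENT=$((INCREMENT + 1))

        # feed the XML straight to the query from a here-document,
        # so no newfile.xml is created at all
        run_query <<EOF
<header>blablabla</header>
<header2>blablabla $INCREMENT</header2>
<Insertquery>Insert into newtable</Insertquery>
<values>'one', '2012-03-23 $INCREMENT', 'message'</values>
EOF
done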

Last edited by Scrutinizer; 03-23-2012 at 07:55 AM..