sed working slow on big files


 
# 1  
Old 02-15-2011
sed working slow on big files

Hi Experts,

I'm using the following code to remove the trailing spaces at the end of each line in a file.

Code:
sed "s/[ ]*$//g" <filename> > <new_filename>
mv <new_filename> <filename>

This works fine for files of up to 20-25 GB.

For bigger files, it takes more time than the original file generation itself. :-)

I need to process files that will be 4-5 times bigger than this.

Please suggest a faster way.
# 2  
Old 02-15-2011
Not sure how much this buys you, but sed 's/ *$//' file > file1 should be a little faster: the trailing g flag is redundant, since $ can only match once per line.
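Another tweak worth timing: forcing the C locale often speeds up sed on big files, because it skips multibyte character handling. A minimal sketch (the sample file here is made up):

```shell
# Sketch: strip trailing spaces under the C locale, which avoids
# multibyte locale processing and is often faster on large files.
printf 'a  \nbb \nc\n' > sample.txt          # sample input with trailing spaces
LC_ALL=C sed 's/ *$//' sample.txt > sample.out
cat sample.out                               # same lines, trailing spaces removed
```

Whether this helps depends on your sed implementation and current locale, so time it on a slice of your real data first.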
# 3  
Old 02-16-2011
Thanks! I can see a slight saving in time.

But is there any other way to do it? Any command other than sed?
# 4  
Old 02-16-2011
Some process must be writing the file. Rewrite that process to omit the trailing spaces, or pipe its output through the sed command as the file is written. As for something faster, maybe perl:
Code:
perl -pe 's/ *$//'

For real speed, a custom C program would be needed. But not writing the spaces in the first place would be optimal.
# 5  
Old 02-17-2011
Thank you!

You are right. One last question: is cut faster than sed?

To avoid the spaces I have modified the code, but now I'm getting 2 junk characters at the beginning of every line. I want to use
Code:
cut -c 3- <filename>

to remove those.
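If the junk columns really must be stripped after the fact, the cut and the trailing-space removal can at least be combined into one pipeline, so the big file is read only once (a sketch with made-up file names and junk bytes):

```shell
# Sketch: drop the first two characters of each line, then trim
# trailing spaces, in a single pass over the file.
printf 'XXabc  \nXXdef \n' > in.txt        # XX stands in for the 2 junk characters
cut -c 3- in.txt | sed 's/ *$//' > out.txt
cat out.txt
```

Note that cut -c counts characters, which in a multibyte locale may not equal bytes; cut -b 3- is the byte-oriented variant.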
# 6  
Old 02-18-2011
Quote:
to avoid spaces I have modified the code but getting 2 junk characters at the beginning of every line
What software is producing this file? It might be easier to fix at source.

Can you post, say, four sample lines with control codes visible? Just wondering if this is not a proper unix text file.
Code:
sed -n l four_sample_lines.txt
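For example, on a line with trailing spaces, sed's l command marks the end of line with $ and prints non-printing characters in escaped form, so invisible junk stands out (the sample line here is made up):

```shell
# Sketch: sed -n l shows each line with non-printing characters
# escaped and a $ marking the true end of the line.
printf 'cd  \n' | sed -n l
# prints: cd  $   (the spaces before $ are the trailing spaces)
```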


Last edited by methyl; 02-18-2011 at 07:48 PM..