I'm creating a lot of test data for some upcoming performance testing. The vendor product I used to create the file had a slight bug in it and got some times wrong, so I decided to use sed to fix it.
This works on the command line but not in a shell script, and I can't work out why.
File to be changed (I've added the $ sign as the terminal isn't wide enough):
I want to change, for example, 039914 to 100001 and 039915 to 102000.
So, as expected, sed 's/039914/100001/' file.dat > file1.dat works and I get what I want.
In the shell script, given I have about 2,000 substitutions to make, I cat a file with the substitution pairs, e.g.
039914 100001
039915 102000
Then the script, again dead simple:
As expected, I should be running sed "s/039914/100001/" at the bold line, but for some reason it removes everything up to ANZ rather than substituting. A -xv trace gives a weird sed command, which is what is confusing me, e.g.
+ sed -n 1 p cnt1.txt
replace=100100
echo 039914 100100
echo "Replacing 039914 with 100100"
+
/g file.dat 9914/100100
If I understand correctly, the problem is with variable substitution inside the sed command.
You can write it like this:
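As a minimal sketch of the quoting difference (the sample values are taken from the question, the input string is just for illustration):

```shell
#!/bin/sh
# One of the pairs from the question, for illustration.
match=039914
replace=100001

# With single quotes the shell does NOT expand the variables, so sed is
# handed the literal pattern $match:
#   sed 's/$match/$replace/'
# With double quotes the shell expands them first, which is what you want:
printf '%s\n' "alpha 039914 omega" | sed "s/$match/$replace/"
# prints: alpha 100001 omega
```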
That wasn't quite the answer, but it got me thinking. Thanks.
For some reason, not sure why, the backticks in the variable creation were interfering with the sed. Removing the backticks suddenly made it work.
I have no idea why, though.
[code]
#!/bin/sh
rm -f b1.dat
touch b1.dat
j=`wc -l < cnt1.txt`
cnt=1
while [ $cnt -le $j ] ; do
    match=`sed -n "$cnt p" cnt1.txt | awk '{print $1}'`
    replace=`sed -n "$cnt p" cnt1.txt | awk '{print $2}'`
    sed -e "s/$match/$replace/g" file.dat >> b1.dat
    # (I've tried a bazillion different ways to do this line and even
    # setenv at the command prompt works)
    cnt=`expr $cnt + 1`
done
[/code]
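Since the loop above re-reads file.dat once per pair (and appends a full substituted copy each time), a single-pass alternative may be worth considering: generate a sed script from cnt1.txt and apply it once. This is only a sketch, assuming cnt1.txt holds space-separated old/new pairs that contain no characters special to sed (/, &, ., *, etc.); the file names are taken from the question.

```shell
#!/bin/sh
# Turn each "old new" pair into a sed substitution command ...
awk '{ printf "s/%s/%s/g\n", $1, $2 }' cnt1.txt > subs.sed
# ... then apply all ~2,000 substitutions in one pass over the data.
sed -f subs.sed file.dat > b1.dat
```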