Wow, I did not expect ex to be so much more efficient than sed.
Do you have the total run-time in seconds for the three approaches too, pbluescript?
Sure. These were all submitted to an LSF queue, and each node has a minimum of 8 cores with 2.8 GHz+ Intel Xeon CPUs and 16 GB of RAM, running RHEL 5.3. Here is some extra info about each job, with the actual run time listed:
My method: 195,389 seconds
  Max Memory: 5 MB
  Max Swap: 266 MB
  Max Processes: 5
  Max Threads: 6
hergp's method: 209,240 seconds
  Max Memory: 2676 MB
  Max Swap: 2870 MB
  Max Processes: 4
  Max Threads: 5
alister's method: 42,573 seconds
  Max Memory: 121 MB
  Max Swap: 392 MB
  Max Processes: 5
  Max Threads: 6
Going by actual run time, the awk/ex method looks even better.
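For anyone curious about the general shape of the awk/ex technique, here is a minimal sketch (the file names and mappings are made up, not the exact commands from the thread): awk turns each old,new pair from the CSV into an ex substitute command, and ex then applies all of them to the file in a single pass.

```shell
# Hypothetical sample data: conversion.csv holds "old,new" pairs.
printf 'foo,bar\nbaz,qux\n' > conversion.csv
printf 'a foo line\na baz line\n' > file.txt

# awk emits one "%s" substitution per mapping, then "x" to save and exit;
# ex -s (silent mode) reads those commands from stdin and edits in place.
awk -F, '{print "%s/" $1 "/" $2 "/g"} END {print "x"}' conversion.csv | ex -s file.txt
```

Because ex holds the file open across all substitutions, the file is read and written once, instead of once per mapping as in a sed-in-a-loop approach.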
---------- Post updated at 08:52 AM ---------- Previous update was at 08:44 AM ----------
Quote:
Originally Posted by Scrutinizer
@pbluescript, is file.txt free format? Could you post a sample?
Sure. The actual commands I ran were slightly different from what I posted, as there are two places per line that could be changed, but I only wanted one of them to change.
sed in a for loop version:
alister's version:
Here is a sample of what I started with:
Here is a sample of conversion.csv:
Here is a sample of the final result:
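The original samples were lost to the code tags, so here is a generic stand-in for the "replace only one of two occurrences per line" requirement (file names and IDs are hypothetical): without the trailing "g" flag, sed's s/// changes only the first match on each line.

```shell
# Hypothetical stand-ins for the lost samples: conversion.csv maps old->new,
# and each data line contains the old ID twice, but only the first should change.
printf 'ID1,NEW1\n' > conversion.csv
printf 'ID1 data ID1\n' > file.txt

# No "g" flag, so only the first occurrence per line is replaced.
# A temp file is used instead of sed -i for portability.
while IFS=, read -r old new; do
    sed "s/$old/$new/" file.txt > file.tmp && mv file.tmp file.txt
done < conversion.csv
```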
Last edited by Scrutinizer; 06-18-2012 at 10:38 AM. Reason: code tags
Hi All,
I have a file that I need to be able to find a pattern match on a line, search that line for a text pattern, and replace that text.
An example of 4 lines in my file is:
1. MatchText_randomNumberOfText moreData ReplaceMe moreData
2. MatchText_randomNumberOfText moreData moreData... (4 Replies)
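A common sketch for this kind of task (the replacement text here is a made-up placeholder): a sed address restricts the substitution to lines matching the pattern, so other lines pass through untouched.

```shell
# Sample lines modeled on the question; "NewText" is a hypothetical replacement.
printf 'MatchText_abc moreData ReplaceMe moreData\nOtherText moreData ReplaceMe moreData\n' > sample.txt

# The /^MatchText_/ address means s/// runs only on matching lines.
sed '/^MatchText_/s/ReplaceMe/NewText/' sample.txt > out.txt
```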
Hi all,
I'm having some trouble with a shell script that I have put together to search our web pages for links to PDFs.
The first thing I did was:
ls -R | grep .pdf > /tmp/dave_pdfs.out
Which generates a list of all of the PDFs on the server. For the sake of argument, say it looks like... (8 Replies)
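One thing worth noting about that command: in `grep .pdf`, the "." matches any character, so names like "XpdfY" would slip in, and `ls -R` output does not carry full relative paths. A sketch of a safer alternative with find (the demo directory and files are made up):

```shell
# Hypothetical directory tree for demonstration.
mkdir -p demo/sub
touch demo/a.pdf demo/sub/b.pdf demo/notes.txt

# find matches the literal *.pdf glob and prints full relative paths.
find demo -type f -name '*.pdf' > /tmp/dave_pdfs.out
```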
Hello,
I really would appreciate some help with a bash script for some string manipulation on an SQL dump:
I'd like to be able to rename "sites/WHATEVER/files" to "sites/SOMETHINGELSE/files" within the sql dump.
This is quite easy with sed:
sed -e... (1 Reply)
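The sed command got cut off above, but the general shape for this kind of path rename is straightforward (the dump fragment below is invented for illustration): using "|" as the s/// delimiter avoids having to escape the slashes inside the paths.

```shell
# Hypothetical one-line dump fragment containing the path to rename.
printf "INSERT INTO files VALUES ('sites/WHATEVER/files/a.png');\n" > dump.sql

# "|" as the delimiter keeps the slashes in the paths readable.
sed -e 's|sites/WHATEVER/files|sites/SOMETHINGELSE/files|g' dump.sql > dump.new.sql
```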
Hello All,
I'm a hardware engineer, and I have written this script to automate my job. I got stuck at the following point.
CODE:
..
..
...
foreach $key (keys %arr_hash) {
    my ($loc, $ind, $add) = split /,/, $arr_hash{$key};
    &create_verilog($key, $loc, $ind, $add);
}
sub create_verilog{... (2 Replies)
Hi all,
I have a problem searching hundreds of CSV files: the search takes too long (over 5 min).
The CSV files are "," delimited and have 30 fields per line, but I always grep the same 4 fields, so is there a way to grep just those 4 fields to speed up the search?
Example:... (11 Replies)
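A sketch of one way to restrict the match to specific fields (the field positions and sample data here are assumptions, since the example was cut off): awk tests only the named fields but still prints the whole matching line.

```shell
# Hypothetical sample; suppose the interesting fields are 1 and 3.
printf 'a,junk,needle,junk\nb,junk,other,junk\n' > data.csv

# Only fields 1 and 3 are tested, so "needle" elsewhere on a line is ignored.
awk -F, '$1 ~ /needle/ || $3 ~ /needle/' data.csv > hits.txt
```

This avoids regex-scanning the other 26 fields of every line, which is where a plain grep spends most of its time.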
Dear All,
I want to search for a particular string and replace the value on the next line.
The following is the test file.
The search string is
tmp,???
,10:1 ("???" may contain any 3 characters and should remain the same); the next line should be replaced with ,10:50.
tmp,123 --- if it matches tmp,??? then... (3 Replies)
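A sketch of the "replace the line after a match" pattern with sed (the sample file is made up from the description): the `n` command prints the matched line and loads the next one, so the substitution applies only to lines immediately following a tmp,??? line.

```shell
# Lines "tmp,???" (any three characters) are followed by ",10:1",
# which should become ",10:50"; a ",10:1" elsewhere stays unchanged.
printf 'tmp,123\n,10:1\nother\n,10:1\n' > sample.txt

# "n" advances to the line after the match before s/// runs.
sed '/^tmp,...$/{n;s/^,10:1$/,10:50/;}' sample.txt > out.txt
```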
Hi, could anyone help me with this? I have tried several times but am still not getting it right, and I don't have enough grounding to do it outside of JavaScript. Using awk, sed, or bash, I need to go through a text file using a for loop, replacing substrings in the file that consist of a potentially multi... (3 Replies)
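Since the question was cut off, here is only a generic sketch of the loop-and-replace shape it describes (file name and strings are invented): a read loop walks the file line by line and sed rewrites each line.

```shell
# Hypothetical input; every "old" should become "new".
printf 'one old two old\nno match here\n' > input.txt

# Line-by-line replacement; in bash, ${line//old/new} would do the same
# substitution without forking sed per line.
while IFS= read -r line; do
    printf '%s\n' "$line" | sed 's/old/new/g'
done < input.txt > output.txt
```

For large files, a single `sed 's/old/new/g' input.txt` pass is far faster than any per-line loop; the loop form mainly helps when each line needs extra shell logic.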
Hi Team,
I am new to Unix; please help me with this.
I have a file named properties.
The content of the file is :
##Mobile props
east.url=https://qa.east.corp.com/prop/end
west.url=https://qa.west.corp.com/prop/end
south.url=https://qa.south.corp.com/prop/end... (2 Replies)
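The question is truncated, so the goal below is a guess: assuming the task is to switch every URL from the qa environment to another one (say, prod), a single sed pass over the properties file would look like this.

```shell
# One sample line from the properties file shown above.
printf 'east.url=https://qa.east.corp.com/prop/end\n' > properties

# Anchor on "//qa." so only the environment component of the host changes.
sed 's|//qa\.|//prod.|' properties > properties.new
```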
Hi all,
I have a lookup table from which I am looking up values (from col1) and replacing them with the corresponding values (from col2) in another file.
lookup file
a,b
c,d
So just replace a by b, and replace c by d.
mainfile
a,fvvgeggsegg,dvs
a,fgeggefddddddddddg... (7 Replies)
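A common sketch for this two-file lookup (the second mainfile line is invented to round out the truncated sample): NR==FNR is true only while awk reads the first file, so the lookup pairs are loaded into an array before the main file is processed.

```shell
# Lookup pairs and main data, per the samples above (second line assumed).
printf 'a,b\nc,d\n' > lookup.csv
printf 'a,fvvgeggsegg,dvs\nc,fgeggeg,g\n' > mainfile.csv

# First pass fills map[]; second pass swaps field 1 when a mapping exists.
# Reassigning $1 makes awk rebuild the line with OFS, hence -v OFS=,
awk -F, -v OFS=, 'NR==FNR {map[$1]=$2; next} $1 in map {$1=map[$1]} 1' \
    lookup.csv mainfile.csv > out.csv
```

Matching on the whole field ($1 in map) rather than a substring avoids accidentally rewriting "a" inside longer values.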
This is my first experience writing unix script. I've created the following script. It does what I want it to do, but I need it to be a lot faster. Is there any way to speed it up?
cat 'Tax_Provision_Sample.dat' | sort | while read p; do fn=`echo $p|cut -d~ -f2,4,3,8,9`; echo $p >> "$fn.txt";... (20 Replies)
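The loop above forks echo and cut for every input line, which dominates the run time. A sketch of a single-pass awk replacement (the sample data and field count are made up): note that `cut -f2,4,3,8,9` emits fields in file order (2,3,4,8,9), so the awk key below mirrors that order.

```shell
# Hypothetical "~"-delimited sample; both lines share the same key fields.
printf 'r1~K1~K2~K3~x~x~x~K4~K5\nr2~K1~K2~K3~x~x~x~K4~K5\n' > Tax_Provision_Sample.dat

# One awk process builds the file name from the key fields and appends
# each line to it; awk keeps the output file open across lines.
sort Tax_Provision_Sample.dat |
awk -F'~' '{ fn = $2 "~" $3 "~" $4 "~" $8 "~" $9 ".txt"; print > fn }'
```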