Full Discussion: Help speeding up script
Shell Programming and Scripting, Post 302942369 by neutronscott, Tuesday 28th of April 2015, 09:14 AM
It's because you're running an external program for each line: forking a new process for every line of input costs far more than the text processing itself.

Here is an awk solution:

Code:
sort Tax_Provision_Sample.dat | awk -F~ '{f=$2 FS $4 FS $3 FS $8 FS $9 ".txt"; print $0 >> f; close(f);}'
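Here every line is appended to a file whose name is built from fields 2, 4, 3, 8 and 9 (FS is "~", so it doubles as the separator inside the name), and close(f) keeps awk from running out of open file descriptors when there are many distinct keys. The >> matters: since each file is closed after every write, a later print to the same name reopens it, and a plain > would truncate it. To see the naming, feed the awk part one made-up line (placeholder fields, not real data):

Code:
printf 'A~B~C~D~E~F~G~H~I\n' |
awk -F~ '{f = $2 FS $4 FS $3 FS $8 FS $9 ".txt"; print f}'
# prints: B~D~C~H~I.txt

The sort is not needed for correctness; it just means each output file receives its lines in sorted order.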


 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

speeding up the compilation on SUN Solaris environment

Dear friends, please let me know how I can increase the speed of my compilation in the SUN Solaris environment. I have many subfolders which contain .cc files. When I compile the makefile at the root, it takes a long time to compile all the subfolders and generate the object (.o) files. Can... (2 Replies)
Discussion started by: swamymns

2. Shell Programming and Scripting

Speeding up processing a file

Hi guys, I'm hoping you can help me here. I've knocked up a script that looks at a (huge) log file, and pulls from each line the hour of each transaction and how long each transaction took. The data is stored sequentially as: 07:01 blah blah blah 12456 blah 07:03 blah blah blah 234 blah... (4 Replies)
Discussion started by: dlam

3. UNIX for Dummies Questions & Answers

Speeding up a Shell Script (find, grep and a for loop)

Hi all, I'm having some trouble with a shell script that I have put together to search our web pages for links to PDFs. The first thing I did was: ls -R | grep .pdf > /tmp/dave_pdfs.out, which generates a list of all of the PDFs on the server. For the sake of argument, say it looks like... (8 Replies)
Discussion started by: Dave Stockdale

4. UNIX for Dummies Questions & Answers

Speeding/Optimizing GREP search on CSV files

Hi all, I have a problem with searching hundreds of CSV files: the search takes too long (over 5 min). The CSV files are ","-delimited and have 30 fields per line, but I always grep the same 4 fields, so is there a way to grep just those 4 fields to speed up the search? (One way to restrict matching to specific fields is sketched after this list.) Example:... (11 Replies)
Discussion started by: Whit3H0rse

5. Shell Programming and Scripting

speeding up bash script with "while read line"

Hello everybody, I'm still slowly treading my way into bash scripting (without any prior programming experience), and hence my code is mostly what some might call "creative", if they're being kind :D I have created a script that serves its purpose, but it does so very slowly, since it needs to work... (4 Replies)
Discussion started by: origamisven

6. Shell Programming and Scripting

Speeding up search and replace in a for loop

Hello, I am using sed in a for loop to replace text in a 100MB file. I have about 55,000 entries to convert in a csv file with two entries per line. The following script works to search file.txt for the first field from conversion.csv and then replace it with the second field. While it works fine,... (15 Replies)
Discussion started by: pbluescript

7. Shell Programming and Scripting

Speeding up substitutions

Hi all, I have a lookup table from which I am looking up values (from col1) and replacing them by corresponding values (from col2) in another file. (A single-pass awk approach to this kind of job is sketched after this list.) lookup file: a,b c,d So just replace a by b, and replace c by d. mainfile: a,fvvgeggsegg,dvs a,fgeggefddddddddddg... (7 Replies)
Discussion started by: senhia83

8. Shell Programming and Scripting

Speeding up shell script with grep

Hi guys, hoping someone can help. I have two files, both containing UK phone numbers. master is a file which has been collated over a few years and currently contains around 4 million numbers. new is a file which also contains 4 million numbers. I need to split new into two separate files... (4 Replies)
Discussion started by: dunryc

9. Shell Programming and Scripting

Help 'speeding' up this 'parsing' script - taking 24+ hours to run

Hi, I've written a ksh script that reads a file and parses/filters/formats each line. The script runs as expected, but it runs for 24+ hours for a file that has 2 million lines. And sometimes the input file has 10 million lines, which means it can be running for more than 2 days and still not finish.... (9 Replies)
Discussion started by: newbie_01

10. Shell Programming and Scripting

Help with speeding up my working script to take less time - how to use more CPU usage for a script

Hello experts, we have input files with 700K lines each (one generated for every hour), and we need to convert them as below and then move them to another directory. Sample INPUT:- # cat test1 1559205600000,8474,NormalizedPortInfo,PctDiscards,0.0,Interface,BG-CTA-AX1.test.com,Vl111... (7 Replies)
Discussion started by: prvnrk
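As an aside on discussion 4 above: rather than grepping whole CSV lines, awk can test only the fields of interest. A minimal sketch, assuming the four relevant fields are 1, 5, 12 and 27 (made-up positions) and the pattern '12345' is a placeholder:

Code:
# match only in fields 1, 5, 12 and 27 of a 30-field CSV
awk -F, -v pat='12345' '$1 ~ pat || $5 ~ pat || $12 ~ pat || $27 ~ pat' *.csv

This avoids false hits from the other 26 fields; whether it beats grep on wall-clock time depends on the data, since grep scans bytes very quickly but cannot limit itself to fields.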
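And on discussion 7: instead of running sed once per lookup entry, awk can load the whole lookup table and apply every substitution in one pass over the main file. A minimal sketch, assuming keys only ever appear as whole comma-separated fields (file names as in the post):

Code:
# first file (NR==FNR): build the map from the lookup table
# second file: rewrite any field that matches a key exactly
awk -F, -v OFS=, 'NR==FNR { map[$1] = $2; next }
    { for (i = 1; i <= NF; i++) if ($i in map) $i = map[$i]; print }' lookup mainfile

If the keys can occur as substrings inside a field, gsub() would be needed instead of the exact-field test.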
NOWEB(1)                    General Commands Manual                   NOWEB(1)

NAME
       noindex - build external index for noweb document

SYNOPSIS
       noindex basename[.tex]

DESCRIPTION
       noindex looks through LaTeX .aux files for identifiers that should go
       in a noweb external index. It sorts all identifiers and writes the
       results on basename.nwi.

NOWEB INDEXING STRATEGIES
       A noweb program consists of one or more files. In the simple case,
       these files are run through noweave together, to produce a single
       LaTeX file. noweave -index suffices to produce an index and
       cross-referencing information; neither nodefs nor noindex is required.

       When a noweb program consists of several source files, it is often
       better to run each source file through noweave to produce its own
       LaTeX file, then use noindex to produce an external index. This
       technique has several advantages:

       -  The line numbers in the LaTeX files correspond to the line numbers
          in the source files, so it is easier to diagnose LaTeX errors.

       -  The LaTeX \includeonly feature can be used, making it possible to
          format parts of large programs while retaining complete
          cross-reference information.

       -  When used with make(1), the technique avoids running noweave over
          source files that have not changed.

       -  Using the external index places fewer demands on LaTeX's memory,
          making it read its .aux files much more quickly.

       The disadvantages are that nodefs and noindex are needed for full
       cross-referencing and a properly sorted index.

EXAMPLE
       This example assumes a noweb program of three source files: a.nw,
       b.nw, and c.nw. The file doc.tex is assumed to contain LaTeX
       boilerplate, including the commands

              \noweboptions{externalindex}
              \include{a}
              \include{b}
              \include{c}

       The first sequence of steps is to create a file listing all the
       identifiers defined anywhere in a, b, or c.

              nodefs a.nw > a.defs
              nodefs b.nw > b.defs
              nodefs c.nw > c.defs
              sort -u a.defs b.defs c.defs | cpif all.defs

       Using sort -u and cpif(1) avoids changing all.defs unless the set of
       identifiers changes. This technique, used in a Makefile, avoids
       unnecessary rebuilding.

       The next series of steps is to create LaTeX files with full
       cross-reference information for all identifiers.

              noweave -n -indexfrom all.defs a.nw > a.tex
              noweave -n -indexfrom all.defs b.nw > b.tex
              noweave -n -indexfrom all.defs c.nw > c.tex

       The final steps run LaTeX once to create .aux files, then noindex to
       create the index, then LaTeX again to format the complete document.

              latex doc
              noindex doc
              latex doc

       In a Makefile, noindex can be run before every invocation of LaTeX.
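       For illustration, the whole pipeline might be driven from a Makefile
       along these lines (a sketch only: it assumes GNU make pattern rules,
       and no such Makefile ships with noweb):

              DEFS = a.defs b.defs c.defs

              # recipe lines must begin with a tab
              doc.dvi: doc.tex a.tex b.tex c.tex
                      latex doc
                      noindex doc
                      latex doc

              %.defs: %.nw
                      nodefs $< > $@

              all.defs: $(DEFS)
                      sort -u $(DEFS) | cpif all.defs

              %.tex: %.nw all.defs
                      noweave -n -indexfrom all.defs $< > $@

       Because cpif(1) leaves all.defs untouched when the identifier set has
       not changed, editing one source file does not force every chapter
       back through noweave.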
BUGS
       noindex is distributed in awk and Icon versions. The awk version is
       slow and does a poorer job sorting.

       There is no comparable machinery to make it possible to use multiple
       files with the HTML back end.

SEE ALSO
       noweave(1), nodefs(1), cpif(1)

VERSION
       This man page is from noweb version 2.11b.

AUTHOR
       Norman Ramsey, Harvard University. Internet address
       nr@eecs.harvard.edu. Noweb home page at
       http://www.eecs.harvard.edu/~nr/noweb.

local                              3/28/2001                          NOWEB(1)