Script Optimization - large delimited file, for loop with many greps

Since there are approximately 75K gsfiles and hundreds of stfiles per gsfile, this script can take hours to run. How can I rewrite it so that it's much faster? I'm not as familiar with Perl, but I'm open to all suggestions.

Code:
ls file.list > $split
for gsfile in `cat $split`;
do
  # split each gsfile into one file per ST (transaction) segment
  csplit -ks -n 6 -f $gsfile.ST $gsfile /^ST/ {100000} 2>>$diagnostic
  for stfile in `ls $gsfile.ST* | sort -n`;
  do
    delim=`LC_ALL=C grep "^GS" $gsfile | cut -c3` 2>>$diagnostic
    gscode=`LC_ALL=C grep "^GS" $gsfile | cut -d "$delim" -f3` 2>>$diagnostic
    supcd=`LC_ALL=C grep "^N1.SU" $stfile | cut -d "$delim" -f5 | head -1` 2>>$diagnostic
    sellcd=`LC_ALL=C grep "^N1.SE" $stfile | cut -d "$delim" -f5 | head -1` 2>>$diagnostic
    firponum=`LC_ALL=C grep "^IT1" $stfile | cut -d "$delim" -f10 | head -1` 2>>$diagnostic
    invtl=`LC_ALL=C grep "^TDS" $stfile | cut -d "$delim" -f2 | tr -cd '[:digit:]'` 2>>$diagnostic
    # I have about ten more greps here
    echo "$gscode,$supcd,$sellcd,$firponum,$invtl" >> $detail_file
    rm -f $stfile 2>>$diagnostic
  done
done

Here are two example input files. The delimiter can be any non-word character.
Code:
 
gsfile_1
GS*IN*TPU*TPM*110303*0634*65433*X*002000 
ST*810*0001  
N1*SU*TPUNAME*92*TPUCD21 
N1*SE*SELNAME*92*789 
IT1*1*8*EA*909234.12**BP*PARTNUM123*PO*PONUM342342*PL*526 
IT1*2*3*EA*53342.65**BP*PARTNUM456*PO*PONUM31131*PL*528 
TDS*32424214  
SE*7*0001
ST*810*0002  
N1*SU*TPUNAME*92*TPUCD43 
N1*SE*SELNAME*92*543 
DTM*011*110302 
IT1*1*10*EA*909234.12**BP*PARTNUM575*PO*PONUM1253123*PL*001  
IT1*2*15*EA*53342.65**BP*PARTNUM483*PO*PONUM646456*PL*002 
TDS*989248095 
SE*8*0002 
GE*2*65433
gs_file2
GS~IN~TPT~TPM~110302~2055~2321123~X~003010~
ST~810~000027324~
N1~SU~TPMNAME~92~TPUCD87
N1~SE~SELMNAME~92~23234
IT1~001~3450~EA~1234.67~~BP~PARTNUM6546-048~PO~PONUM99484~PL~235~
TDS~425961150~
SE~6~2321123~
GE~1~3201~

The output should look like this:
TPU,TPUCD21,789,PONUM342342,32424214
TPU,TPUCD43,543,PONUM1253123,989248095
TPT,TPUCD87,23234,PONUM99484,425961150

I hope this isn't too long! I'm new and not yet familiar with the forum posting style. Thanks so much for your help.
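
In case it helps explain what I'm after, here is the kind of single-pass awk rewrite I had in mind. It's only a rough, untested sketch: the field positions come from the sample files above, the ten extra fields from the real script are left out, it assumes each gsfile starts with a GS segment, and "extract.awk" is just a placeholder name.
Code:
#!/usr/bin/awk -f
# usage (assuming the file list in $split):
#   cat $split | xargs awk -f extract.awk >> $detail_file

{ sub(/[ \t\r]+$/, "") }                          # strip trailing blanks/CR

/^GS/ {                                           # GS header: grab delimiter and GS code once
    delim  = substr($0, 3, 1)                     # 3rd character is the element delimiter
    split($0, f, delim)
    gscode = f[3]
    next
}

{ split($0, f, delim) }                           # split every other segment once

/^ST/  { supcd = sellcd = firponum = invtl = "" } # new transaction: reset values

/^N1/  && f[2] == "SU" && supcd  == ""  { supcd  = f[5] }
/^N1/  && f[2] == "SE" && sellcd == ""  { sellcd = f[5] }
/^IT1/ && firponum == ""                { firponum = f[10] }
/^TDS/ { invtl = f[2]; gsub(/[^0-9]/, "", invtl) }

/^SE/  { print gscode "," supcd "," sellcd "," firponum "," invtl }

That way each gsfile would be read only once instead of being re-read by csplit plus a dozen greps per transaction. Does that look like a reasonable direction?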
 
