Script Optimization - large delimited file, for loop with many greps


 
# 8  
Old 04-21-2011
Just noticed this:

Code:
PARTNUM6546-048

If this is supposed to be one record, you'll need to change FS slightly.

Code:
# won't accept -
# FS="[^a-zA-Z0-9_]"
# should accept -
FS="[^a-zA-Z0-9_\\-]"

# 9  
Old 04-21-2011
Nosey neighbor

I don't have anything useful to add beyond what Corona gave you, but I am very interested to hear what the results are on your time savings.

Something tells me it will be significant. My guesstimate is that it's roughly 150 MB of raw data (3M lines * 50 characters per line). I've run similar jobs on terabytes' worth of data and it only took me an hour or so.
# 10  
Old 04-21-2011
You brought up a great point, Corona: there are ".", " ", and "-" characters in user-entered fields that I don't want awk to use as delimiters ... I haven't yet been able to figure out the syntax for including those characters in the FS statement.

I'm experimenting so that I can understand HOW awk works, and so that I can apply the same kind of logic to some of the other parsing tasks in my original script.

I'm still working on the IF statements (to grab just the first occurrence).

But I really like this solution ... I agree this will save hours of processing time

---------- Post updated at 06:24 PM ---------- Previous update was at 06:15 PM ----------

It works great BTW!
# 11  
Old 04-21-2011
Quote:
Originally Posted by verge
You brought up a great point, Corona: there are ".", " ", and "-" characters in user-entered fields that I don't want awk to use as delimiters ... I haven't yet been able to figure out the syntax for including those characters in the FS statement.
Just add them; if they don't work, put a \\ in front of them. Easy enough to test with echo "a-b-c-d-e-f" | awk 'BEGIN { FS="..." } { print $1, $2, $3 }'
Quote:
I'm experimenting so that I can understand HOW awk works
If you followed the logic of my example, you're already over the main hurdle: understanding that awk has its own built-in "while" loop that reads data record by record based on RS (usually newline), then splits each record into tokens based on FS (usually whitespace). You can solve quite a few tasks just by carefully adjusting its input and output settings.
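For instance (a minimal sketch; parts.txt is a made-up file name), re-delimiting a pipe-separated file as comma-separated purely through those settings:

Code:
# read pipe-separated fields, write comma-separated ones
awk 'BEGIN { FS="|"; OFS="," } { $1 = $1; print }' parts.txt
# the $1 = $1 assignment forces awk to rebuild $0 with the new OFS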

But you can also cavalierly ignore them whenever you please and just deal with whole lines via $0, print whatever you like with printf, read an extra line into $0 with getline, etc.

Other mindblowers:
  • N gives you a variable; $N gives you the field whose number is N's value. Say you did N=4; V=$N, that's effectively V=$4. This makes it easy to get the last field (NF is the number of fields, so $NF is the last field), to loop over fields with for(N=1; N<=NF; N++) printf("%s\n", $N);, etc. $0 is the entire record (usually a line).
  • You can actually modify the $1,$2,... special variables! And the value of $0 will change itself to match. And vice versa, so you can, say, do substitutions inside $0 and end up with different tokens. Or make one little change to $3 and print the entire modified line with just $3=toupper($3); print;
  • Arrays can have strings as indexes, making it easy to do word counting: { for(N=1; N<=NF; N++) words[$N]++ } END { for(key in words) printf("%s %d\n", key, words[key]) } (see the sketch after this list)
...and lots more. My own understanding of awk is far from complete.
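Putting a couple of those together (a minimal sketch; the sample input is made up):

Code:
# upper-case the first field, print the modified line, then report word
# counts at the end; try:  echo "one two two three" | awk -f sketch.awk
{
    for (N = 1; N <= NF; N++)   # string-indexed array: word counting
        words[$N]++
    $1 = toupper($1)            # changing $1 rebuilds $0 to match
    print                       # the whole modified record
}
END {
    for (key in words)
        printf("%s %d\n", key, words[key])
}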

Last edited by Corona688; 04-21-2011 at 11:20 PM..
# 12  
Old 04-26-2011
Thanks again, Corona ... I'm still having trouble with excluding - as a delimiter in

Code:
FS="[^a-zA-Z0-9\\_\\.\\/\\-\\ ]"
 
I've tried \\-, \-, -, '-'

---------- Post updated at 01:56 PM ---------- Previous update was at 01:52 PM ----------

Thanks again for your help ... the processing time dropped dramatically, from almost 6 hours to 20 minutes, for roughly 3 million lines searched against a list of about 35K lines.

---------- Post updated at 02:21 PM ---------- Previous update was at 01:56 PM ----------

I have a slightly different issue with delimiters. I've been isolating and replacing the delimiters with grep, cut, and sed, but I know there is a much better way to do this with awk or even perl.

In my file the token separator can be any non-word character, and the line terminator will also be a non-word character (the rule dictates that these two characters must be different). To add to the issue, user-entered fields can include non-word characters, as long as those characters aren't the ones used for the token/line separators.

Code:
Example record
ISA¼3213¼part-number¼address~GS¼56756¼control{number~ST¼09898~
 
Into 
ISA|3213|part-number|address
GS|56756|control{number
ST|09898

I need to replace the token separator ¼ with a more standard delimiter like |, and I need to replace the line terminator ~ with a line break (CR/LF, hex 0D0A).

However, ¼ and ~ change from record to record, and there are thousands of them in a file. I've csplit the files so that each resulting file has a single token/field separator and a single line terminator.
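Something along these lines is what I'm after for each csplit piece (rough sketch only; piece00 and the separator values are placeholders, and GNU sed is assumed for \n in the replacement):

Code:
# each csplit piece has one fixed separator pair, so two substitutions
# re-delimit it in a single pass; a separator that happens to be a regex
# metacharacter (. * [ etc.) would need escaping before use
sep='¼'; term='~'        # placeholders: this piece's separator pair
sed -e "s/$sep/|/g" -e "s/$term/\n/g" piece00 > piece00.out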

Let me know if I can explain this more clearly!
# 13  
Old 04-26-2011
To keep - from being treated as a delimiter, don't escape it: put the bare - at the start or end of the bracket expression, where it's literal. (The dashes inside ranges like 0-9 don't count.)
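For example (a minimal sketch; the sample line is made up):

Code:
# dash unescaped and placed just before the closing bracket, so it stays
# literal; the comma is outside the class and still splits fields
echo "part-number,123.456 abc" | awk 'BEGIN { FS="[^a-zA-Z0-9_./ -]" } { print NF; print $1; print $2 }'
# expected: 2, then part-number, then 123.456 abc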

Is there or is there not a newline already where ~ is?

How did you csplit them?
# 14  
Old 04-26-2011
There is no newline where the ~ is ... I csplit by the record header "ISA"

... most are already on a new line
... but if a file has fewer than three lines, I know the record has this issue
... "ISA" always has a non-word character before and after it
... the line terminator is always at position 107 (in my file)

In the shell version of my script, I used an if and sed to put each ISA on its own line, and then csplit the file on ISA.

I'm not sure how to cleverly add a newline where the ~ is.
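(One hedged sketch of the basic approach; piece00 is a placeholder and ~ stands for whatever terminator this piece uses:)

Code:
# turn every terminator into a newline so each segment lands on its own
# line; safe here because the separator rule keeps the terminator out of
# user-entered fields
awk '{ gsub(/~/, "\n"); print }' piece00 > piece00.out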