Severe performance issue while 'grep'ing on large volume of data (posted by Corona688, 03-07-2011)
Quote:
Originally Posted by alister
An FS=\| is necessary to properly split that file.
Good point.
Quote:
Testing with the only version of grep that I have that supports -o (gnu grep 2.5.1 on a disused laptop that seldom sees any action)
Nuts, I thought that was a portable option. Throw that whole script out, then.
Quote:
if multiple patterns match a single line (not knowing anything about the content of the files, it's a possibility), it does not print the filename before every match (only the first), even with -H.
I did note it expected one pattern per line.
Quote:
Actually, assuming the splitting on file-1 is done correctly and $2 is sent to the temp file, the grepping is being done on the object names
Argh. My script doesn't work, then.

It's looking less and less like there's going to be an efficient solution if you have to parse that file the hard way, line by individual line, on systems and shells with no extra features.

---------- Post updated at 01:31 PM ---------- Previous update was at 12:49 PM ----------

OK, here's a brute-force version in awk:

Code:
BEGIN   {
        FS="|"  ;       OFS="|"
        # Read the list of ID|name pairs from file-1 into n[],
        # keyed by object name.  Check getline's return value so a
        # missing or unreadable file-1 doesn't loop forever on the
        # error return (-1).
        while((getline < "file-1") > 0) n[$2]=$1;
        close("file-1");
}

{
        # For every known object name, count a hit whenever the name
        # appears anywhere in the current line of the current file.
        for(i in n)     if(index($0, i) > 0)
                o[FILENAME "|" n[i]]++;
}

END     {       for(k in o)     print k, o[k];  }

I don't think it'll be quite as efficient as grep, but it ought to do. It ran in two seconds on 5,000 files cached, maybe 5-10 seconds uncached. I don't believe I used any GNU-specific features, either.

Put that in extract.awk and run with
Code:
xargs awk -f extract.awk < file-2

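To make the data flow concrete, here's a hypothetical run. The file contents and counts below are made up for illustration; your real format may differ, but the shape of the output (filename|ID|count, thanks to OFS="|") should match:

Code:
$ cat file-1
1001|WIDGET_A
1002|WIDGET_B
$ cat file-2
log1.txt
log2.txt
$ xargs awk -f extract.awk < file-2
log1.txt|1001|3
log2.txt|1002|7
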
You can change o[FILENAME "|" n[i]]++; to o[FILENAME "|" i]++; if I somehow got name vs ID backwards again.

---------- Post updated at 01:54 PM ---------- Previous update was at 01:31 PM ----------

I estimate that would take around 60 minutes on data similar to yours on a 'fast' machine. Use 'xargs -n 10' to keep each awk invocation from eating all your memory. Maybe not so good. If there were some sort of pattern to the OIDs/names, that would help a lot for finding them without having to brute-force check all 50,000 individually... Or if you knew for a fact you had or could get GNU grep, and didn't have more than one object per line, grep's hardwired performance is going to be way better than awk's scripted performance...
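For reference, the grep route might look roughly like this. This is an untested sketch: it assumes GNU grep (for -o and -f), assumes names.txt is a scratch file you can create, and inherits the caveat from earlier in the thread that grep 2.5.x may only print the filename on the first of several matches on a line:

Code:
# Pull the object names (field 2) out of file-1 into a scratch file:
awk -F'|' '{ print $2 }' file-1 > names.txt
# Fixed-string search: -o prints each match on its own line, -H
# prefixes the filename, then count each filename:name pair:
xargs grep -F -o -H -f names.txt < file-2 | sort | uniq -c
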
