Speeding/Optimizing GREP search on CSV files


 
# 8  
Old 09-05-2010
Can't use that option on my Solaris:

Code:
grep: illegal option -- P

# 9  
Old 09-05-2010
Quote:
Originally Posted by Whit3H0rse
Can't use that option on my solaris:

Code:
grep: illegal option -- P

Oops, sorry. That option is specific to GNU grep. Check in Solaris for a similar option described as "Interpret PATTERN as a Perl regular expression."
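If no such option exists, perl itself is a portable stand-in for GNU grep's -P on Solaris. A minimal sketch (the file name, fields, and pattern below are assumptions for illustration only):

```shell
# Hypothetical sample CSV: match rows whose second field starts with 555.
printf 'jones,5551234,NY\nsmith,4441234,CA\n' > data.csv

# GNU grep (Linux) would accept a Perl regex directly:
#   grep -P '^[^,]*,555' data.csv
# On Solaris, where grep has no -P, perl gives the same semantics:
perl -ne 'print if /^[^,]*,555/' data.csv
```

Any system with perl installed can run this, regardless of which grep variant it ships.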
# 10  
Old 09-05-2010
I think you guys are missing one important point: the longest delay is always introduced by disk reads. When the file is read, it is read as a whole, so any operation performed after the file is opened won't reduce the main delay. And I don't think you will be able to open just the part of the file you are interested in. If you are really concerned about those 5-minute searches, or if you think it might get worse in the future (as the number of files to be checked increases), then I would suggest migrating to a database with text indexes (I think PostgreSQL has that feature).
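For what it's worth, a minimal sketch of that idea in PostgreSQL (table, column, and path names are purely illustrative assumptions, not part of the original discussion): load the CSV into a table once, then index the searched column so lookups stop scanning every record.

```sql
-- Hypothetical table for the CSV records; names are illustrative only.
CREATE TABLE records (key text, payload text);

-- Bulk-load the CSV (server-side path shown; psql's \copy works client-side).
COPY records FROM '/path/to/data.csv' WITH (FORMAT csv);

-- A plain btree index makes exact lookups on the key column an index scan:
CREATE INDEX records_key_idx ON records (key);

-- Now this hits the index instead of reading every row:
SELECT * FROM records WHERE key = 'ABC123';
```

The one-time cost of loading and indexing is repaid on every subsequent search.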
# 11  
Old 09-05-2010
Quote:
Originally Posted by bartus11
When file is being read it is opened as a whole
Can you please explain what you mean by "opened as a whole"? The whole file does not get loaded into primary memory.
Quote:
, so any operation that is performed after file is opened won't reduce main delay.
If it is a large file, it cannot all be loaded in one shot; the system has to go back, do a disk read, and fetch the next portion - a page fault.
Quote:
And I don't think you will be able to open just part of the file you are interested in.
Opening part of the file is not an option here. The OP is interested in searching only part of each record, not part of the file, which means the whole file (all the records) has to be read, but there is no need to spend time scanning through the whole of each record.
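That distinction can be sketched with awk, which compares only the named field rather than the full record (file name, delimiter, and pattern below are assumptions for illustration):

```shell
# Hypothetical CSV where only the first field needs to be matched.
printf 'ABC123,long payload...\nXYZ999,another long payload...\n' > records.csv

# awk tests only $1, so the rest of each record is never pattern-matched
# (every record is still read from disk, as discussed above):
awk -F, '$1 ~ /^ABC/' records.csv
```

This saves regex work per record, though as noted it cannot save the disk reads themselves.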
# 12  
Old 09-05-2010
I meant reading full lines (records) while the file is open... Let's assume you have a file with records long enough to span many disk blocks. Let's further assume you can tell the system to stop reading disk blocks after a certain part of such a long record has been read. Now how do you tell where the next record starts? I remind you that a file is a stream of bytes, so the next record will start somewhere in the next few disk blocks, but where exactly?
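The byte-stream point can be demonstrated directly: reading a fixed byte range tears records, because nothing in the file marks where the next newline falls. A small sketch (sample data is an assumption for illustration):

```shell
# Two short CSV records in a plain byte stream.
printf 'first,record,here\nsecond,record,here\n' > stream.csv

# Read only the first 10 bytes; the cut lands mid-record, not at a
# record boundary, because byte offsets know nothing about newlines:
dd if=stream.csv bs=1 count=10 2>/dev/null
```

To find the next record after a partial read you would still have to scan forward for the next newline, which is exactly the work being avoided.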
 