UNIX for Dummies Questions & Answers: Speeding/Optimizing GREP search on CSV files
Post 302450976 by matrixmadhan, Sunday 5th of September 2010, 06:03 AM
Quote:
Originally Posted by Whit3H0rse
Yes, you are right, the above example is not much faster (if at all) :/

@matrixmadhan, how do you tell a perl or grep search to stop searching after the 13th field and skip to the next line? Could you give an example? (a one-liner, of course)
Frankly, I don't know whether such tools or programs exist. The best thing would be to modify the grep source code to suit your needs; it shouldn't be that difficult.
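For what it's worth, awk or perl can approximate this without patching grep, since both can split each line and test only the first 13 fields before moving on. A minimal sketch, assuming comma-separated input and a hypothetical literal string PATTERN:

    # awk: test fields 1-13 only, then jump to the next line
    awk -F',' '{ for (i = 1; i <= 13 && i <= NF; i++)
                     if (index($i, "PATTERN")) { print; next } }' file.csv

    # perl: autosplit on commas, test only the first 13 fields
    perl -F',' -lane 'print if grep { defined && index($_, "PATTERN") >= 0 } @F[0..12];' file.csv

Neither touches anything past the 13th field, which is the saving the poster is after, though the whole line is still read from disk.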
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Speeding up a Shell Script (find, grep and a for loop)

Hi all, I'm having some trouble with a shell script that I have put together to search our web pages for links to PDFs. The first thing I did was: ls -R | grep .pdf > /tmp/dave_pdfs.out, which generates a list of all of the PDFs on the server. For the sake of argument, say it looks like... (8 Replies)
Discussion started by: Dave Stockdale
8 Replies
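For the question above, find is usually a better fit than ls -R | grep .pdf, since the unanchored .pdf pattern also matches names that merely contain "pdf" (the dot matches any character). A sketch, assuming GNU find/grep and a hypothetical web root /var/www:

    # list every PDF under the web root, case-insensitively
    find /var/www -type f -iname '*.pdf' > /tmp/dave_pdfs.out

    # then check all HTML pages for links to any of those PDFs in one pass,
    # matching on the bare file names as fixed strings (-F)
    sed 's|.*/||' /tmp/dave_pdfs.out | grep -rFl -f - --include='*.html' /var/www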

2. Shell Programming and Scripting

grep'ing and sed'ing chunks in bash... need help on speeding up a log parser.

I have a file, 20-80+ MB in size, that is a certain type of log file. It logs one of our processes, and this process is multi-threaded, so the log file is kind of a mess. Here's an example: the logfile looks like "DATE TIME - THREAD ID - Details", and a new file is created... (4 Replies)
Discussion started by: elinenbe
4 Replies
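When a multi-threaded process interleaves lines of the form "DATE TIME - THREAD ID - Details", one awk pass can demultiplex the log into a file per thread, which is usually far cheaper than repeated grep/sed passes over a 20-80 MB file. A rough sketch, assuming " - " really is the separator (the example in the thread is truncated above) and the hypothetical input name big.log:

    # one pass: append each line to a per-thread file keyed on field 2
    # (gawk manages many open files itself; with other awks, close() files
    # once a thread's chunk is done)
    awk -F' - ' '{ print > ("thread_" $2 ".log") }' big.log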

3. UNIX for Dummies Questions & Answers

Using grep to search within files

Hi, at my company we have custom web sites that we create for different clients. The folder structure is something like: <project name>/html/web/custom/ The custom folder contains a file called "category.html". Every project has the same folder structure and the same file names, but the data... (2 Replies)
Discussion started by: miklo
2 Replies
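Since every project shares the same layout, a shell glob can pin the search to exactly those files instead of grepping recursively. A sketch, assuming the projects sit under a hypothetical /projects root and PATTERN is the text being looked for:

    # -l lists just the projects whose category.html contains the pattern
    grep -l 'PATTERN' /projects/*/html/web/custom/category.html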

4. UNIX for Dummies Questions & Answers

Reading compressed files during a grep search

All, the bottom line is that I'm reading a file, storing it as variables, recursively grep searching it, and then piping it to allow word counts as well. I am unsure how to open any .zip, .tar, and .gzip files, search for keywords, and return results. Any help would be much appreciated! Thanks (6 Replies)
Discussion started by: ryan.lee
6 Replies
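Standard wrappers cover most of these formats without unpacking anything by hand. A sketch, with KEYWORD and the file names as placeholders; zgrep ships alongside gzip, tar can stream member contents to stdout, and unzip -p does the same for zip archives:

    # gzip-compressed files: zgrep works like grep
    zgrep -c 'KEYWORD' file.gz

    # tar archives: -O streams member contents to stdout (add -z for .tar.gz)
    tar -xOf archive.tar | grep -c 'KEYWORD'

    # zip archives: unzip -p extracts to stdout
    unzip -p archive.zip | grep -c 'KEYWORD'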

5. UNIX for Dummies Questions & Answers

pattern search using grep in specific range of files

Hi, I am trying to do the following: grep -l <pattern> <files to be searched for> In <files to be searched for>, all files should be of some specific date, like "Apr 8", not all files in the current directory. I just want to search within the Apr 8 files so that it won't search the entire list of... (2 Replies)
Discussion started by: apjneeraj
2 Replies
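GNU (and BSD) find can select files by modification date directly, which is safer than parsing ls output. A sketch using find's -newermt test, with PATTERN as a placeholder and the year assumed:

    # grep -l only in files last modified on Apr 8
    find . -maxdepth 1 -type f -newermt '2010-04-08' ! -newermt '2010-04-09' \
        -exec grep -l 'PATTERN' {} +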

6. Shell Programming and Scripting

Perl search csv fileA where two strings exist on another csv fileB

Hi, I have two csv files with the following formats: FileA.log: Application, This occurred blah Application, That occurred blah Application, Also this AnotherLog, Bob did this AnotherLog, Dave did that FileB.log: Uk, London, Application, datetime, LaterDateTime, Today it hadn't... (8 Replies)
Discussion started by: PerlNewbRP
8 Replies
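The exact matching rule is truncated above, but the usual perl shape for this two-file job is to load one file into a hash and test the other against it. A rough sketch that prints each FileA row whose first field (e.g. "Application") appears as the third field of some FileB row; extending the test to a second string is the same pattern with a second hash:

    # pass 1 (FileB.log): remember field 3 of every row
    # pass 2 (FileA.log): print rows whose field 1 was seen in FileB
    perl -F'/,\s*/' -lane '
        if ($ARGV eq "FileB.log") { $seen{$F[2]} = 1; next }
        print if defined $F[0] && $seen{$F[0]};
    ' FileB.log FileA.log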

7. Shell Programming and Scripting

Speeding up search and replace in a for loop

Hello, I am using sed in a for loop to replace text in a 100MB file. I have about 55,000 entries to convert in a csv file with two entries per line. The following script works to search file.txt for the first field from conversion.csv and then replace it with the second field. While it works fine,... (15 Replies)
Discussion started by: pbluescript
15 Replies
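Invoking sed once per conversion entry means 55,000 passes over the 100 MB file; loading conversion.csv into an awk map turns that into a single pass. A sketch, assuming the old,new pairs are comma-separated and that the strings to replace occupy whole comma-delimited fields of file.txt (if they can occur anywhere in a line, a different strategy is needed):

    # pass 1: build the old -> new map; pass 2: swap whole fields that match
    awk -F',' -v OFS=',' 'NR == FNR { map[$1] = $2; next }
        { for (i = 1; i <= NF; i++) if ($i in map) $i = map[$i]; print }' \
        conversion.csv file.txt > file.new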

8. Shell Programming and Scripting

awk read column csv and search in other csv

Hi, does someone know how I can read a specific column of a CSV file and search for its values in another CSV's columns? If the value exists in the second CSV, copy the entire row, with all fields, to a new CSV file. I suppose it's possible using awk, but I'm no expert. Thanks in advance. (8 Replies)
Discussion started by: giankan
8 Replies
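This is the classic NR==FNR two-file idiom in awk. A rough sketch, with the column number, file names, and output name all placeholders: collect column 2 of the first CSV, then copy any row of the second CSV that carries one of those values in any field:

    # pass 1: remember the wanted values; pass 2: print matching rows whole
    awk -F',' 'NR == FNR { want[$2] = 1; next }
        { for (i = 1; i <= NF; i++) if ($i in want) { print; next } }' \
        first.csv second.csv > matched.csv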

9. Shell Programming and Scripting

Optimizing search using grep

I have a huge log file, close to 3GB in size. My task is to generate some reporting based on the number of times something is being logged. I need to find the number of times StringA, StringB, and StringC are each being called. What I am doing right now is: grep "StringA" server.log | wc -l... (4 Replies)
Discussion started by: Junaid Subhani
4 Replies
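Each grep | wc -l rereads the whole 3 GB file; awk can tally all three strings in one pass (counting matching lines, just as grep | wc -l does):

    # one pass over the log, three counters
    awk '/StringA/ { a++ } /StringB/ { b++ } /StringC/ { c++ }
         END { print "StringA:", a+0, "StringB:", b+0, "StringC:", c+0 }' server.log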

10. Shell Programming and Scripting

Speeding up shell script with grep

Hi guys, hoping someone can help. I have two files, both containing UK phone numbers. master is a file which has been collated over a few years and currently contains around 4 million numbers; new is a file which also contains 4 million numbers. I need to split new into two separate files... (4 Replies)
Discussion started by: dunryc
4 Replies
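With one number per line, grep -F -x -f (fixed strings, whole-line match, patterns from a file) or an awk hash does the split without any loop; grep with 4 million patterns can be memory-hungry, which is why the awk form is shown too. A sketch, assuming "split" means separating numbers already present in master from genuinely new ones:

    # numbers from new that already exist in master, and the rest
    grep -Fx  -f master new > new_known
    grep -Fxv -f master new > new_unknown

    # the same split in a single awk pass
    awk 'NR == FNR { m[$0] = 1; next }
         { print > (($0 in m) ? "new_known" : "new_unknown") }' master new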