Grep -f for big files


# 1  
Grep -f for big files

OK guys,
this isn't homework or anything.
I have been using grep -f all my life, but I am trying it on a huge file and it doesn't work.

Can someone give me a replacement for grep -f patternfile file for big files?

Thanks
# 2  
Quote:
Originally Posted by ahfze
OK guys,
this isn't homework or anything.
I have been using grep -f all my life, but I am trying it on a huge file and it doesn't work.

Can someone give me a replacement for grep -f patternfile file for big files?

Thanks
Please define/describe "doesn't work".
What exactly are you trying to do, and in what way does it "fail"?
# 3  
I'm trying to grep -f patternfile file.
Both patternfile and file are huge, so grep doesn't work here.
I've been reading that awk is the way out here.

I need a solution for this.
Thanks
# 4  
Quote:
Originally Posted by ahfze
I'm trying to grep -f patternfile file.
Both patternfile and file are huge, so grep doesn't work here.
I've been reading that awk is the way out here.

I need a solution for this.
Thanks
We can't help you unless you answer the previous question in detail.
Thanks
# 5  
I'm trying to
Code:
grep -f patternfile file

Both patternfile and file are huge, so grep doesn't work here.

I've been reading that awk is the way out here.

Can someone help me with an awk script that does the same job as grep -f?
Thanks
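A common awk replacement for grep -f in this situation is the two-file hash-lookup idiom (a sketch, assuming the patterns are literal strings, one per line, and that each should match a whole line of the input — not a regular expression or a substring):

```shell
# awk equivalent of `grep -f patternfile file` for whole-line matches.
# First pass (NR==FNR, i.e. while reading patternfile): store every
# pattern line as a key in an array. Second pass: print each line of
# the data file that exists as a key. Only patternfile must fit in memory.
awk 'NR==FNR { pat[$0]; next } $0 in pat' patternfile file
```

The lookup is a single hash probe per line, which is why this scales where a huge regex alternation does not. If the patterns are fixed strings but may match substrings, grep -F -f patternfile file is usually much faster than plain grep -f, since it skips regex compilation entirely.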
# 6  
Can you post a sample patternfile and a sample input file?
# 7  
Hi guys.
OK, so I'm working with a huge pattern file, Q.
I am trying to split the pattern file Q into 50-line chunks and save the grep output for each chunk in file_0, file_1, ... etc.
The problem is that the output keeps being saved to file_0; it's not saved to file_1, file_2, etc.
Can someone help?

This is the code I wrote:
Code:
i=0
# Split the pattern file into 50-line chunks: Q.split.aa, Q.split.ab, ...
split -l 50 Q Q.split.
for CHUNK in Q.split.*; do
    # Each chunk's matches go to its own numbered output file.
    grep -f "$CHUNK" MEGA-CNN-AND-LINKDIN > "file_$i"
    i=$((i + 1))
done
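If the chunking is only a workaround for grep -f running out of memory on the full pattern file, a single awk pass can do the whole job without splitting at all (a sketch, assuming the patterns in Q are fixed strings matched against whole lines; MEGA-CNN-AND-LINKDIN is the input-file name from the script above):

```shell
# Read every line of Q into a hash, then print each input line that
# appears in the hash. Only Q has to fit in memory, and the input
# file is read exactly once.
awk 'NR==FNR { pat[$0]; next } $0 in pat' Q MEGA-CNN-AND-LINKDIN > matches
```

This produces one combined output file rather than one per chunk, which is usually what the split-and-loop approach is ultimately concatenating back together anyway.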

