Grep -f for big files
# 1  
Old 05-30-2019
OK guys,
this isn't homework or anything.
I've been using grep -f all my life, but I'm trying it on a huge file and it doesn't work.

Can someone give me a replacement for grep -f patternfile file that works for big files?

Thanks
# 2  
Old 05-30-2019
Quote:
Originally Posted by ahfze
OK guys,
this isn't homework or anything.
I've been using grep -f all my life, but I'm trying it on a huge file and it doesn't work.

Can someone give me a replacement for grep -f patternfile file that works for big files?

Thanks
Please define/describe "doesn't work".
What exactly are you trying to do, and in what way does it "fail"?
These 2 Users Gave Thanks to vgersh99 For This Post:
# 3  
Old 05-30-2019
I'm trying to grep -f patternfile file.
Both patternfile and file are huge, so grep doesn't work here.
I've been reading that awk is the way out here.

I need a solution for this.
Thanks
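As an aside, before reaching for awk: grep -f treats every pattern as a regular expression, which is usually what makes it crawl on large pattern files. If the patterns are plain strings, adding -F (fixed-string matching) is often enough on its own:

```shell
# -F tells grep to treat each pattern as a literal string rather than
# a regex; with many patterns this is typically far faster.
grep -F -f patternfile file
```

This is a sketch assuming literal (non-regex) patterns; it still matches substrings of each line, just like plain grep -f does.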
# 4  
Old 05-30-2019
Quote:
Originally Posted by ahfze
I'm trying to grep -f patternfile file.
Both patternfile and file are huge, so grep doesn't work here.
I've been reading that awk is the way out here.

I need a solution for this.
Thanks
I don't think we can help you unless you answer the previous question in detail.
Thanks
# 5  
Old 05-30-2019
I'm trying to
Code:
grep -f patternfile file

Both patternfile and file are huge, so grep doesn't work here.
I've been reading that awk is the way out here.

Can someone help me with an awk script that does the same job as grep -f?
Thanks
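For the record, the usual awk idiom for this is a two-file pass (a sketch, with one important caveat: unlike grep -f, it matches whole lines as fixed strings, not regexes or substrings):

```shell
# First pass (NR==FNR) loads every line of patternfile into an array;
# second pass prints each line of file found in that array.
awk 'NR==FNR { pat[$0]; next } $0 in pat' patternfile file
```

Memory use is proportional to patternfile, but file is only streamed, so this scales well when the data file is the huge one.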
# 6  
Old 05-31-2019
Can you post a sample pattern file and input file?
# 7  
Old 06-01-2019
Hi guys,
OK, so I'm working with a huge pattern file Q.
I'm trying to split the pattern file Q into 50-line chunks and save the output of each grep in file_1, file_2, etc.
The problem is that the output keeps getting saved in file_0; it never reaches file_1, file_2, etc.
Can someone help?

I wrote this code.
Code:
i=0
split -l 50 Q Q.split.
for CHUNK in Q.split.* ; do
    grep -f "$CHUNK" MEGA-CNN-AND-LINKDIN > "file_$i"
    i=$((i + 1))
done
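For anyone hitting the same problem: the counter has to be incremented inside the chunk loop, with no extra while/done wrapped around it. A minimal runnable version of the chunked idea might look like this (a sketch: Q and MEGA-CNN-AND-LINKDIN are the filenames from the post, and -F is added on the assumption that the patterns are fixed strings, which speeds grep up considerably):

```shell
# Split the pattern file Q into 50-line chunks (Q.split.aa, Q.split.ab, ...).
split -l 50 Q Q.split.

# Run one grep per chunk, writing each chunk's matches to its own
# numbered output file; the counter advances once per chunk.
i=0
for chunk in Q.split.*; do
    grep -F -f "$chunk" MEGA-CNN-AND-LINKDIN > "file_$i"
    i=$((i + 1))
done
```

Note that each output file only holds the matches for its own chunk of patterns, so you may still want to concatenate the file_* results afterwards.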
