Operating Systems > HP-UX
Post 302574529 by arb_1984, Thursday 17 November 2011, 12:24 PM
Performance issue with 'grep' command for huge file size


I have two files: one (say, details.txt) contains the details of employees, and the other (say, emp.txt) contains a list of selected employee names. I am extracting the matching employee details from details.txt using the names in emp.txt with the following code:
Code:
# For each employee name in emp.txt, scan details.txt and append any matching lines
while read emp_name
do
    grep -e "$emp_name" details.txt >> output.txt
done < emp.txt

The above code works and gives the expected result, but it takes far too long when the files are large (I don't have an exact timing; it ran for more than 6 hours before I cancelled the script). For example, details.txt is around 2.5 GB with roughly 7.5 lakh (750,000) records, and emp.txt holds about 55,000 employee names. Can you please suggest another option or command that handles files this large more efficiently? Thanks.
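
For context on why this is slow: every iteration of the loop starts a new grep process that rescans the entire 2.5 GB file, so the script performs roughly 55,000 full passes over details.txt. A common alternative (a minimal sketch, not taken from this thread) is to hand grep the whole name list at once with -f, so the big file is scanned only a single time; -F additionally treats each name as a fixed string rather than a regular expression:

Code:
# Sketch only: read all the names from emp.txt and scan details.txt in one pass.
# -F treats each line of emp.txt as a fixed string (drop it if the names are regexes);
# on older systems, fgrep -f emp.txt details.txt behaves the same way.
grep -F -f emp.txt details.txt > output.txt

Whether this is fast enough depends on the grep implementation; with tens of thousands of patterns some versions are still slow, in which case loading the names into an awk array and testing each record against it is another common approach.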
