Problem with parsing a large file


 
# 1  
Old 10-18-2006
Problem with parsing a large file

Hi All,

Following is the sample file

Quote:
91,1
91,2
91,4
91,3
81,2
81,3
81,1
and following is the output desired

Quote:
81,1
91,3
That is, the last entry for each unique first field is required.

My solution is as follows

Quote:
cut -d "," -f1 rcont | sort -u > 1

$ cat 1
81
91

$ for var in `cat 1`
> do
> grep "^$var," rcont | tail -1
> done
81,1
91,3
However, the original file has around a million entries and around 100,000 unique first fields, and each grep rescans the whole file once per key, so this solution will take a very long time to execute.

Is there a better and faster way of doing it?

Regards,
Gaurav
# 2  
Old 10-18-2006
Code:
nawk -F',' '{a[$1]=$0} END { for (i in a) print a[i]}' myFile.txt | sort -n
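
For example, run against the sample data in the first post, this prints the desired output (assuming the file is saved as 'rcont'; on Linux, plain awk or gawk works in place of nawk):

Code:
$ nawk -F',' '{a[$1]=$0} END { for (i in a) print a[i]}' rcont | sort -n
81,1
91,3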

# 3  
Old 10-18-2006
Awesome vgersh, can you please explain the command?
# 4  
Old 10-18-2006
Quote:
Originally Posted by vgersh99
Code:
nawk -F',' '{a[$1]=$0} END { for (i in a) print a[i]}' myFile.txt | sort -n

using awk's associative arrays... read the file in; for every record/line, create/update the associative array indexed by the value of the first field, with the value of the record/line itself. The last update of an entry in the array will be done by the LAST record for a given index [the FIRST field of a record/line].

After processing ALL the records/lines in the file [the 'END' block of 'awk']... iterate through the previously populated array 'a' using the iterator 'i' [the first field in the original file] and output the value for a given index [the original record/line].

Because the final array iteration does not guarantee the ORDER of the entries, 'sort -n' the output - the sorting is numeric and is done based on the FIRST column.

Hope it's clearER
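
For readability, the same logic can be spelled out as a commented script - a sketch, with the file name 'lastperkey.awk' invented for illustration:

Code:
# lastperkey.awk - print the last line seen for each unique first field
# usage: nawk -f lastperkey.awk myFile.txt | sort -n
BEGIN { FS = "," }   # fields are comma-separated
{ a[$1] = $0 }       # each line overwrites the previous one with the same key
END {                # once all input has been read...
    for (i in a)     # ...walk the array; iteration order is unspecified,
        print a[i]   # hence the trailing 'sort -n'
}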
# 5  
Old 10-18-2006
Thanks vgersh for your timely help; it will definitely save a lot of time.
# 6  
Old 10-18-2006
Hi Vgersh,

How can we search and replace a pattern in a file without opening it, and have the replacement saved back to the file?

Thanks,
Sam.
# 7  
Old 10-18-2006
Quote:
Originally Posted by sbasetty
Hi Vgersh,

How can we search and replace a pattern in a file without opening it, and have the replacement saved back to the file?

Thanks,
Sam.
First, pls open a NEW thread.
Second, define what you mean by 'without opening a file'?

Last edited by vgersh99; 10-19-2006 at 11:11 AM.