Perl: Need help comparing huge files


 
# 8  
Old 07-13-2012
Perl's only limitation is the amount of memory in your system.

If these 205 million records are 4 or 5 bytes each like you've shown, the raw data might amount to close to a gig of memory, and Perl hashes add substantial per-key overhead on top of that. If the records are much larger, they probably won't fit into memory on a 32-bit system at all, and other approaches would need to be tried, such as sorting both files first: with sorted input you can tell when a record is absent without having to load every possible record into memory at once...

If your records aren't as you've shown, then nothing we've written for you is likely to work at all anyway. We need to see what you're really dealing with.
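
Before ruling the hash approach in or out, you could measure the real per-key cost on a small sample and extrapolate, since a hash entry costs far more than the raw bytes of the key. A rough sketch, assuming the Devel::Size module from CPAN is installed (my assumption, it hasn't come up in this thread):

Code:
#!/usr/bin/perl
# Extrapolate hash memory use for 205 million keys from a small sample.
use strict;
use warnings;
use Devel::Size qw(total_size);

my %hash;
# 100,000 sample keys, about 5 bytes each like the records shown so far.
$hash{ sprintf '%05d', $_ } = 1 for 1 .. 100_000;

my $per_key = total_size( \%hash ) / 100_000;
printf "~%.0f bytes per key; 205 million keys => ~%.1f GiB\n",
    $per_key, $per_key * 205_000_000 / 2**30;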
# 9  
Old 07-13-2012
I don't know whether your memory will be enough. Try it; otherwise you will need another approach.
# 10  
Old 07-13-2012
Quote:
Originally Posted by Corona688
Perl's only limitation is the amount of memory in your system.

If these 205 million records are 4 or 5 bytes each like you've shown, the raw data might amount to close to a gig of memory, and Perl hashes add substantial per-key overhead on top of that. If the records are much larger, they probably won't fit into memory on a 32-bit system at all, and other approaches would need to be tried, such as sorting both files first: with sorted input you can tell when a record is absent without having to load every possible record into memory at once...

If your records aren't as you've shown, then nothing we've written for you is likely to work at all anyway. We need to see what you're really dealing with.
They are about 20 bytes each.
# 11  
Old 07-13-2012
At 20 bytes each, 205 million records come to about 4.1 GB of raw data, call it 3.8 gigs, and that's before Perl's per-key hash overhead is even counted. Not going to fit in a 32-bit process.
# 12  
Old 07-13-2012
If you sort your data, however, you can use the comm utility, which does not need to load either file completely into memory. Since the lines are in sorted order, it can tell whether a line is unique to one file or common to both just by whether the next line from one file compares less than, greater than, or equal to the next line from the other...

sort is smart enough to work in chunks and merge through temporary files rather than run out of memory. Just be sure you have enough /tmp space, or point it at another directory with room for its temporary files (GNU sort takes -T /path). See man sort for details.

Code:
$ sort data1 > data1-s
$ sort data2 > data2-s
$ comm -2 -3 data1-s data2-s > only-data1   # suppress cols 2 and 3: lines only in data1
$ comm -1 -3 data1-s data2-s > only-data2   # suppress cols 1 and 3: lines only in data2
$ cat only-data1
1233
4444
7777
$ cat only-data2
1244
9898
9999
$

Note that it might be possible to run comm once to get both sets of data, if only I knew what your data looks like -- which I still don't, after asking several times...
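
For what it's worth, here is a rough Perl sketch of that one-pass idea: a merge walk over the two sorted files that writes out the lines unique to each in a single read of both, equivalent to the two comm runs above. The file names follow my earlier example, and since it compares bytes, the inputs should be sorted with LC_ALL=C to match:

Code:
#!/usr/bin/perl
# One pass over two sorted files, splitting out the lines unique to each.
use strict;
use warnings;

open my $fh1,  '<', 'data1-s'    or die "data1-s: $!";
open my $fh2,  '<', 'data2-s'    or die "data2-s: $!";
open my $out1, '>', 'only-data1' or die "only-data1: $!";
open my $out2, '>', 'only-data2' or die "only-data2: $!";

my $l1 = <$fh1>;
my $l2 = <$fh2>;
while ( defined $l1 and defined $l2 ) {
    my $cmp = $l1 cmp $l2;
    if    ( $cmp < 0 ) { print $out1 $l1; $l1 = <$fh1>; }   # only in file 1
    elsif ( $cmp > 0 ) { print $out2 $l2; $l2 = <$fh2>; }   # only in file 2
    else               { $l1 = <$fh1>;   $l2 = <$fh2>; }    # in both, skip
}

# Whatever is left over in either file has no match in the other.
while ( defined $l1 ) { print $out1 $l1; $l1 = <$fh1>; }
while ( defined $l2 ) { print $out2 $l2; $l2 = <$fh2>; }

Only one line from each file is ever held in memory, so it handles 205 million records as easily as five.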
# 13  
Old 07-25-2012
Thank you. Your version worked.

Quote:
Originally Posted by birei
Try:
Code:
$ cat inputfile1
1233
2345
3456
4444
7777
$ cat inputfile2
1244
2345
3456
9898
9999
$ cat script.pl
use warnings;
use strict;

my (%hash);

die qq|Usage: $0 <inputfile-1> <inputfile-2> <outputfile-1> <outputfile-2>\n|
        unless @ARGV == 4;

open my $ifh1, q|<|, shift or die;
open my $ifh2, q|<|, shift or die;
open my $ofh1, q|>|, shift or die;
open my $ofh2, q|>|, shift or die;

# Load every record of the first file into a hash.
while ( <$ifh1> ) {
        chomp;
        $hash{ $_ } = 1;
}

# Records found in both files are deleted from the hash; anything
# that appears only in the second file goes straight to outputfile-2.
while ( <$ifh2> ) {
        chomp;
        if ( exists $hash{ $_ } ) {
                delete $hash{ $_ };
                next;
        }

        printf $ofh2 qq|%s\n|, $_;    # %s rather than %d, so wide records aren't mangled
}

# Whatever survived in the hash was never seen in the second file.
for ( sort { $a <=> $b } keys %hash ) {    # numeric sort assumes numeric records
        printf $ofh1 qq|%s\n|, $_;
}
$ perl script.pl inputfile1 inputfile2 outputfile1 outputfile2
$ cat outputfile1
1233
4444
7777
$ cat outputfile2
1244
9898
9999
