Full Discussion: Comparing two huge files
Post 302231726 by RahulJoshi on Wednesday 3rd of September 2008, 03:44:07 AM
To compare two files (note that comm expects both files to be sorted):
comm file1 file2

For the differences:
diff file1 file2
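
A slightly expanded sketch of the same two commands; the .sorted file names are placeholders, and the sort step is there because comm only gives meaningful output on sorted input:

# comm expects both inputs to be sorted
sort file1 > file1.sorted
sort file2 > file2.sorted

# three columns: lines only in file1, lines only in file2, lines in both
comm file1.sorted file2.sorted

# diff works on unsorted files and reports line-by-line edits
diff file1 file2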
 

10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

comparing Huge Files - Performance is very bad

Hi All, can you please help me resolve the following problem? My requirement is this: 1) I have two files, YESTERDAY_FILE and TODAY_FILE. Each one has nearly two million records. 2) I need to check each record of TODAY_FILE against YESTERDAY_FILE. If it exists we can skip it by... (5 Replies) A sketch of one possible approach follows below.
Discussion started by: madhukalyan
5 Replies
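
A minimal sketch of one way to tackle this, assuming whole-line records and enough temporary disk space for sort(1); the output file name is a placeholder:

# sort both files so comm can stream through them
sort YESTERDAY_FILE > yesterday.sorted
sort TODAY_FILE > today.sorted

# -13 suppresses lines unique to YESTERDAY_FILE and lines common to both,
# leaving only the records that are new in TODAY_FILE
comm -13 yesterday.sorted today.sorted > new_records.txt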

2. UNIX for Dummies Questions & Answers

Difference between two huge files

Hi, as per my requirement, I need to take the difference between two big files (around 6.5 GB) and write the difference to an output file without any line numbers or '<' or '>' in front of each new line. As the diff command won't work for files this big, I tried to use bdiff instead. I am getting incorrect... (13 Replies) One alternative is sketched after this entry.
Discussion started by: pyaranoid
13 Replies
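
A hedged sketch of one alternative to diff/bdiff for files this size: sort both files (sort spills to disk, so memory stays bounded) and let comm print only the lines that differ; the awk step strips the leading tab that comm puts in front of lines coming from the second file:

sort file1 > file1.sorted
sort file2 > file2.sorted

# -3 hides lines common to both files; no '<' or '>' markers are produced
comm -3 file1.sorted file2.sorted | awk '{ sub(/^\t/, ""); print }' > differences.txt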

3. UNIX for Advanced & Expert Users

Huge files manipulation

Hi, I need a fast way to delete duplicate entries from very large files (>2 GB) of plain text. I tried all the usual methods (awk / sort / uniq / sed / grep ..) but it always ended with the same result (memory core dump). I am using large HP-UX servers. Any advice will... (8 Replies) A memory-friendly approach is sketched below.
Discussion started by: Klashxx
8 Replies
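
One approach that usually avoids the memory blow-up, sketched under the assumption that the order of the surviving lines does not matter; -T points the temporary files at a filesystem with enough free space:

# external merge sort; duplicates are dropped by -u, memory use stays bounded
sort -u -T /var/tmp huge_file > huge_file.dedup

If the original line order must be preserved, awk '!a[$0]++' does that, but it keeps every distinct line in memory, which is probably what caused the core dumps in the first place.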

4. Shell Programming and Scripting

Compare 2 folders to find several missing files among huge amounts of files.

Hi, all: I've got two folders, say, "folder1" and "folder2". Under each there are thousands of files, and it's quite obvious that some files are missing from each. I just would like to find them. I believe this can be done with the "diff" command. However, if I change the above question a... (1 Reply) A sketch follows this entry.
Discussion started by: jiapei100
1 Replies
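
A minimal sketch, assuming it is the file names (not their contents) that need comparing; the .lst paths are placeholders:

# list relative paths under each folder and sort them
( cd folder1 && find . -type f | sort ) > /tmp/folder1.lst
( cd folder2 && find . -type f | sort ) > /tmp/folder2.lst

# names present in folder1 but missing from folder2
comm -23 /tmp/folder1.lst /tmp/folder2.lst

# names present in folder2 but missing from folder1
comm -13 /tmp/folder1.lst /tmp/folder2.lst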

5. Shell Programming and Scripting

Comparing two huge files on field basis.

Hi all, I have two large files and I want a field-by-field comparison for each record in them. All fields are tab-separated. file1: Email SELVAKUMAR RAMACHANDRAN Email SHILPA SAHU Web NIYATI SONI Web NIYATI SONI Email VIINII DOSHI Web RAJNISH KUMAR Web ... (4 Replies) An awk sketch follows below.
Discussion started by: Suman Singh
4 Replies
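
A rough awk sketch of a field-by-field comparison, assuming the two files hold the same records in the same order and that file1 fits in memory; it prints every field that differs, with its record and field number:

awk -F'\t' '
    # first pass: remember every field of file1, keyed by record and field number
    NR == FNR { for (i = 1; i <= NF; i++) f1[FNR, i] = $i; nf[FNR] = NF; next }
    # second pass: compare file2 against what was stored
    {
        max = (NF > nf[FNR]) ? NF : nf[FNR]
        for (i = 1; i <= max; i++)
            if (f1[FNR, i] != $i)
                printf "record %d, field %d: \"%s\" vs \"%s\"\n", FNR, i, f1[FNR, i], $i
    }
' file1 file2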

6. Shell Programming and Scripting

Comparing 2 huge text files

I have these 2 files: k5login sanwar@systems.nyfix.com jjamnik@systems.nyfix.com nisha@SYSTEMS.NYFIX.COM rdpena@SYSTEMS.NYFIX.COM service/backups-ora@SYSTEMS.NYFIX.COM ivanr@SYSTEMS.NYFIX.COM nasapova@SYSTEMS.NYFIX.COM tpulay@SYSTEMS.NYFIX.COM rsueno@SYSTEMS.NYFIX.COM... (11 Replies) A sketch follows this entry.
Discussion started by: linuxgeek
11 Replies
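
A small sketch for finding entries that appear in one list but not the other; the second file's name is cut off above, so second_file is a placeholder, and the tr step is there only because the sample entries mix upper- and lower-case realms:

# normalise case, sort, then let comm show what is unique to each side
tr 'A-Z' 'a-z' < k5login | sort > k5login.norm
tr 'A-Z' 'a-z' < second_file | sort > second.norm

comm -23 k5login.norm second.norm    # only in k5login
comm -13 k5login.norm second.norm    # only in second_file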

7. Shell Programming and Scripting

Perl: Need help comparing huge files

What do I need to do to have the Perl program below load 205-million-record files into the hash? It currently works on smaller files, but not on huge files. Any idea what I need to modify to make it work with huge files: #!/usr/bin/perl $ot1=$ARGV; $ot2=$ARGV; open(mfileot1,... (12 Replies) A hash-free alternative is sketched below.
Discussion started by: mrn6430
12 Replies
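
The Perl snippet above is truncated, so rather than guessing at it, here is a hedged shell alternative to the hash-based approach: sort both files on the key and let join(1) stream them, so memory stays flat no matter how many records there are. The pipe delimiter and the key field are assumptions:

# sort on the assumed key (field 1, pipe-delimited); sort spills to disk as needed
sort -t'|' -k1,1 file1 > file1.sorted
sort -t'|' -k1,1 file2 > file2.sorted

# stream-merge the two sorted files on the key; nothing is held in a hash
join -t'|' -1 1 -2 1 file1.sorted file2.sorted > matched.txt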

8. Shell Programming and Scripting

awk to parse huge files

Hello All, I have a situation as below: (1) Read a source file (a single file with 1.2 million rows in it). (2) Read destination files one by one and replace the content (a few fields in each) with the corresponding matching field from the source file. I tried as below: (please note I am not... (4 Replies) The usual awk two-file pattern is sketched after this entry.
Discussion started by: panyam
4 Replies
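
A hedged awk sketch of the usual two-file pattern for this kind of replacement; the delimiter, the key field and the field being replaced are assumptions, since the thread does not show the record layout:

awk -F'|' -v OFS='|' '
    # first file (the 1.2-million-row source): build a lookup table keyed on field 1
    FNR == NR { map[$1] = $2; next }
    # destination file: overwrite field 2 when the key is known, then print
    $1 in map { $2 = map[$1] }
    { print }
' source_file destination_file > destination_file.new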

9. Shell Programming and Scripting

Work with huge Zipped files

Hello dear members, I have one general and one specific question, and I would be very grateful if you could help me with them. Let's start with my general question: 1. I am working on a cluster computer shared with other people and I need to manipulate a big zipped text file of 13 GB. There is... (1 Reply) A streaming sketch follows below.
Discussion started by: Homa
1 Replies
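
For the general question, a common pattern on shared machines is to stream the compressed file through a pipeline rather than unpacking all 13 GB onto the shared disk; the file name and the awk filter shown here are placeholders:

# decompress to stdout, filter, recompress; only pipe buffers are held in memory
gzip -dc big_file.txt.gz | awk -F'\t' '$3 == "keep"' | gzip > filtered.txt.gz

# quick look at the first records without unpacking anything
gzip -dc big_file.txt.gz | head -20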

10. Shell Programming and Scripting

Aggregation of Huge files

Hi Friends!! I am facing a hash-total issue while working over a set of very large files. Command used: tail -n +2 <File_Name> |nawk -F"|" -v '%.2f' qq='"' '{gsub(qq,"");sa+=($156<0)?-$156:$156}END{print sa}' OFMT='%.5f' The file is pipe-delimited and column 156 is used for hash totalling.... (14 Replies) A cleaned-up sketch of the same idea follows this entry.
Discussion started by: Ravichander
14 Replies
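
For reference, a cleaner spelling of the same hash-total idea, assuming a pipe-delimited file with a one-line header; the file name and the output precision are placeholders:

# skip the header line, strip embedded double quotes, sum |column 156|
tail -n +2 data_file | awk -F'|' '
    { gsub(/"/, ""); total += ($156 < 0) ? -$156 : $156 }
    END { printf "%.5f\n", total }
'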