06-10-2010
We have a similar problem. Are you running diff? That would take forever.
Use something that has associative (hashed) arrays, like awk or perl. Assuming you have several files, each an "old" version versus a "new" version, that should take less than an hour.
You can search here for examples of both types of code on how to find file differences.
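For instance, here is a minimal sketch of the associative-array approach in awk (the file names old.txt and new.txt are stand-ins): the first pass loads every line of the old file into a hash, and the second pass prints each line of the new file that is missing from it.

```shell
# Stand-in data files; substitute your real "old" and "new" files.
printf 'a\nb\nc\n' > old.txt
printf 'a\nc\nd\n' > new.txt

# NR==FNR is true only while reading the first file: hash its lines
# into 'seen'. For the second file, print lines absent from the hash.
awk 'NR==FNR { seen[$0]; next } !($0 in seen)' old.txt new.txt
# prints: d
```

Unlike diff, this makes a single pass over each file and does no pairwise line matching, which is why it scales to large files, at the cost of holding one file's lines in memory.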
You need a lot of virtual memory; we run on a Solaris 9 SPARC V440 with 32GB of memory.
We finish comparing 1.5GB (250K-line) files in about 5 minutes. We run them 12 at a time: 6 old versus 6 new.
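Running several comparisons at once is just a matter of backgrounding each awk job. A sketch, assuming the pairs follow a hypothetical oldN.txt/newN.txt naming scheme (invented here for illustration):

```shell
# Stand-in data; the oldN/newN naming scheme is hypothetical.
printf 'a\nb\n' > old1.txt;  printf 'a\nc\n'    > new1.txt
printf 'x\n'    > old2.txt;  printf 'x\ny\n'    > new2.txt

# One background compare per pair; diffN.txt receives the lines
# that appear in newN.txt but not in oldN.txt.
for i in 1 2; do
    awk 'NR==FNR { seen[$0]; next } !($0 in seen)' \
        "old$i.txt" "new$i.txt" > "diff$i.txt" &
done
wait    # block until every background compare has finished
```

Each job holds its own "old" file in memory, so the job count is bounded by available memory as much as by CPUs.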
I hope this is what you were asking....
This User Gave Thanks to jim mcnamara For This Post:
GETHUGEPAGESIZES(3) Library Functions Manual GETHUGEPAGESIZES(3)
NAME
gethugepagesizes - Get the system supported huge page sizes
SYNOPSIS
#include <hugetlbfs.h>
int gethugepagesizes(long pagesizes[], int n_elem);
DESCRIPTION
The gethugepagesizes() function returns either the number of system supported huge page sizes or the sizes themselves. If pagesizes is
NULL and n_elem is 0, then the number of huge page sizes the system supports is returned. Otherwise, pagesizes is filled with at most
n_elem page sizes.
RETURN VALUE
On success, either the number of huge page sizes supported by the system or the number of huge page sizes stored in pagesizes is returned.
On failure, -1 is returned and errno is set appropriately.
ERRORS
EINVAL n_elem is less than zero, or n_elem is greater than zero and pagesizes is NULL.
Also see opendir(3) for other possible values for errno. This error occurs when the sysfs directory exists but cannot be opened.
NOTES
This call will return all huge page sizes as reported by the kernel. Not all of these sizes may be usable by the programmer since mount
points may not be available for all sizes. To test whether a size will be usable by libhugetlbfs, hugetlbfs_find_path_for_size() can be
called on a specific size to see if a mount point is configured.
SEE ALSO
oprofile(1), opendir(3), hugetlbfs_find_path_for_size(3), libhugetlbfs(7)
AUTHORS
libhugetlbfs was written by various people on the libhugetlbfs-devel mailing list.
October 10, 2008 GETHUGEPAGESIZES(3)