Unless you have enough RAM to process such a volume of data on the fly, it would be advisable to load all that data into a DBMS (database management system) for further processing.
Otherwise: filter out the duplicate lines and save the lines that appear only once into file3.
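A minimal sketch of that step (untested; the file name is a placeholder): sort the input so duplicates become adjacent, then keep only the lines that occur exactly once with uniq -u:

    # keep only the lines that occur exactly once; duplicated lines are dropped entirely
    sort bigfile | uniq -u > file3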
(Also see the -T option of the sort command (depending on your OS) for specifying a directory for its temporary files.)
Then process file3 (I didn't test it, but something along those lines should work).
Regarding the file3 generation, you could also try something like the following.
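For instance, an awk-based alternative (a sketch only: it holds a count for every distinct line in memory, so it only suits inputs whose unique-line set fits in RAM, and it does not preserve line order):

    # count occurrences of each line, then print the ones seen exactly once
    awk '{count[$0]++} END {for (line in count) if (count[line] == 1) print line}' bigfile > file3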
Hi,
As per my requirement, I need to take the difference between two big files (around 6.5 GB each) and write the differences to an output file without any line numbers or '<' or '>' in front of each new line.
As the diff command won't work for such big files, I tried to use bdiff instead.
I am getting incorrect... (13 Replies)
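One common approach worth sketching (not from the thread; file names are placeholders): if line order does not matter, sort both files to disk first, then let comm emit the differing lines with no markers at all:

    sort -T /var/tmp file1 > file1.srt
    sort -T /var/tmp file2 > file2.srt
    comm -23 file1.srt file2.srt >  diff.out   # lines only in file1
    comm -13 file1.srt file2.srt >> diff.out   # lines only in file2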
Hi, I need a fast way to delete duplicate entries from very huge plain-text files (> 2 GB).
I tried all the usual methods (awk / sort / uniq / sed / grep ...) but it always ended with the same result (a memory core dump).
I'm using large HP-UX servers.
Any advice will... (8 Replies)
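One sketch worth trying (untested on HP-UX): sort performs an external merge sort on temporary files, so pointing its -T option at a roomy scratch directory usually avoids the in-memory blow-ups that kill awk- or sed-based approaches:

    # -u keeps one copy of each line; note the original line order is not preserved
    sort -u -T /var/tmp bigfile > bigfile.dedup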
Hi, I need to compare two fixed-length files and write the differences, if any, to a separate file. I have to capture each and every difference, line by line. Ideally my files should not have any differences, but if there are any, they should be captured without any misses. Also my file sizes are... (4 Replies)
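A hedged sketch of one way to do this (file names are placeholders): read both files in lockstep with awk's getline, so neither file has to fit in memory, and report every mismatched pair with its line number (note it does not report extra trailing lines in file2):

    awk '{
        if ((getline line2 < "file2") > 0 && $0 != line2)
            print FNR ": " $0 " <> " line2
    }' file1 > differences.txt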
Hi
I have to write a script to split a huge file into several pieces. The file's columns are pipe (|) delimited. A data sample:
6625060|1420215|07308806|N|20100120|5572477081|+0002.79|+0000.00|0004|0001|......... (3 Replies)
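Since each record here is a single line, a minimal sketch (chunk size and names are placeholders) is to cut the file every N lines with split, which can never break a record:

    # -d (numeric suffixes) is a GNU extension; omit it on other systems
    split -l 1000000 -d bigfile.dat chunk_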
Hi, all:
I've got two folders, say, "folder1" and "folder2".
Under each, there are thousands of files.
It's quite obvious that there are some files missing in each; I just would like to find them. I believe this can be done with the diff command.
However, if I change the above question a... (1 Reply)
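For the first part of the question, a short sketch: diff -rq compares the two trees and reports files that exist in only one of them (as well as files whose contents differ):

    diff -rq folder1 folder2 | grep '^Only in'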
I'm new to Linux scripting and not sure how to filter out bad records from huge flat files (over 1.3 GB each). The delimiter is a semicolon (;).
Here is a sample of 5 lines from the file:
Name1;phone1;address1;city1;state1;zipcode1
Name2;phone2;address2;city2;state2;zipcode2;comment... (7 Replies)
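A hedged sketch, assuming a good record has exactly the six semicolon-separated fields shown in the sample (so a trailing comment field is what makes a record bad):

    awk -F';' 'NF == 6' input.txt > good.txt   # records with exactly 6 fields
    awk -F';' 'NF != 6' input.txt > bad.txt    # everything else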
Hi,
I have a huge 7 GB file which has around 1 million records; I want to split this file into 4 files containing around 250k messages each.
Please help me, as the split command cannot work here since it might cut a record between its tags.
Format of the file is as below
<!--###### ###### START-->... (6 Replies)
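A hedged sketch (untested; it assumes every record begins with a comment line containing the START marker, and that the file begins with one): open a new output file every 250,000 records, so a record is never cut in half:

    awk '
        /START-->/ && count++ % 250000 == 0 { out = sprintf("part_%d", ++p) }
        { print > out }
    ' bigfile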
Hi Friends !!
I am facing a hash-total issue while processing a set of files of huge volume:
Command used:
tail -n +2 <File_Name> | nawk -F'|' -v qq='"' '{gsub(qq, ""); sa += ($156 < 0) ? -$156 : $156} END {print sa}' OFMT='%.5f'
The file is pipe-delimited, and column 156 is used for hash totalling.... (14 Replies)
I have 2 large files (.dat) of around 70 GB each, 12 columns, but the data is not sorted in either file. I need your inputs on the best optimized method/command to compare them and redirect the non-matching lines to a third file (diff.dat).
File 1 - 15 columns
File 2 - 15 columns
Data is... (9 Replies)
Discussion started by: kartikirans
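A hedged sketch for this last one (untested at this scale; it assumes no line is duplicated within a single file): concatenate and sort both files on disk, then keep the lines that appear only once, which are exactly the non-matching ones:

    sort -T /var/tmp file1.dat file2.dat | uniq -u > diff.dat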
LEARN ABOUT CENTOS
hugetlbfs_find_path_for_size
HUGETLBFS_FIND_PATH(3) Library Functions Manual HUGETLBFS_FIND_PATH(3)
NAME
hugetlbfs_find_path, hugetlbfs_find_path_for_size - Locate an appropriate hugetlbfs mount point
SYNOPSIS
#include <hugetlbfs.h>
const char *hugetlbfs_find_path(void);
const char *hugetlbfs_find_path_for_size(long page_size);
DESCRIPTION
These functions return a pathname for a mounted hugetlbfs filesystem for the appropriate huge page size. For hugetlbfs_find_path, the
default huge page size is used (see gethugepagesize(3)). For hugetlbfs_find_path_for_size, a valid huge page size must be specified (see
gethugepagesizes(3)).
RETURN VALUE
On success, a non-NULL value is returned. On failure, NULL is returned.
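EXAMPLE
A minimal usage sketch (not part of the original manual page; compile with -lhugetlbfs):

    #include <stdio.h>
    #include <hugetlbfs.h>

    int main(void)
    {
        /* locate a hugetlbfs mount for the default huge page size */
        const char *path = hugetlbfs_find_path();
        if (path == NULL) {
            fprintf(stderr, "no hugetlbfs mount point found\n");
            return 1;
        }
        printf("hugetlbfs mounted at %s\n", path);
        return 0;
    }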
SEE ALSO
libhugetlbfs(7), gethugepagesize(3), gethugepagesizes(3)
AUTHORS
libhugetlbfs was written by various people on the libhugetlbfs-devel mailing list.
March 7, 2012 HUGETLBFS_FIND_PATH(3)