GETHUGEPAGESIZES(3)        Library Functions Manual        GETHUGEPAGESIZES(3)

NAME
gethugepagesizes - get the system-supported huge page sizes
SYNOPSIS
#include <hugetlbfs.h>
int gethugepagesizes(long pagesizes[], int n_elem);
DESCRIPTION
The gethugepagesizes() function returns either the number of huge page sizes supported by the system or the sizes themselves. If pagesizes is
NULL and n_elem is 0, the number of huge page sizes the system supports is returned. Otherwise, pagesizes is filled with at most n_elem
page sizes.
RETURN VALUE
On success, either the number of huge page sizes supported by the system or the number of huge page sizes stored in pagesizes is returned.
On failure, -1 is returned and errno is set appropriately.
ERRORS
EINVAL n_elem is less than zero, or n_elem is greater than zero and pagesizes is NULL.
gethugepagesizes() may also fail with any of the errors documented for opendir(3); this occurs when the sysfs directory exists but cannot be opened.
NOTES
This call returns all huge page sizes as reported by the kernel. Not all of these sizes may be usable by the programmer, since a mount
point may not be configured for every size. To test whether libhugetlbfs can use a given size, call hugetlbfs_find_path_for_size() on
that size to see if a mount point is configured.
SEE ALSO
oprofile(1), opendir(3), hugetlbfs_find_path_for_size(3), libhugetlbfs(7)
AUTHORS
libhugetlbfs was written by various people on the libhugetlbfs-devel mailing list.
October 10, 2008 GETHUGEPAGESIZES(3)
HUGETLBFS_UNLINKED_FD(3)   Library Functions Manual   HUGETLBFS_UNLINKED_FD(3)

NAME
hugetlbfs_unlinked_fd, hugetlbfs_unlinked_fd_for_size - Obtain a file descriptor for a new unlinked file in hugetlbfs
SYNOPSIS
#include <hugetlbfs.h>
int hugetlbfs_unlinked_fd(void);
int hugetlbfs_unlinked_fd_for_size(long page_size);
DESCRIPTION
These functions return an open file descriptor for a unique, newly-created file in a hugetlbfs filesystem. To avoid leaking hugepages, the
file is unlinked automatically before the function returns.
For hugetlbfs_unlinked_fd, the default huge page size is used (see gethugepagesize(3)). For hugetlbfs_unlinked_fd_for_size, a valid huge
page size must be specified (see gethugepagesizes(3)).
RETURN VALUE
On success, a valid open file descriptor is returned. On failure, -1 is returned and errno may be set appropriately.
SEE ALSO
gethugepagesize(3), gethugepagesizes(3), mkstemp(3), libhugetlbfs(7)
AUTHORS
libhugetlbfs was written by various people on the libhugetlbfs-devel mailing list.
March 7, 2012 HUGETLBFS_UNLINKED_FD(3)