HUGETLBFS_FIND_PATH(3) Library Functions Manual HUGETLBFS_FIND_PATH(3)
NAME
hugetlbfs_find_path, hugetlbfs_find_path_for_size - Locate an appropriate hugetlbfs mount point
SYNOPSIS
#include <hugetlbfs.h>
const char *hugetlbfs_find_path(void);
const char *hugetlbfs_find_path_for_size(long page_size);
DESCRIPTION
These functions return a pathname for a mounted hugetlbfs filesystem for the appropriate huge page size. For hugetlbfs_find_path, the
default huge page size is used (see gethugepagesize(3)). For hugetlbfs_find_path_for_size, a valid huge page size must be specified (see
gethugepagesizes(3)).
RETURN VALUE
On success, a non-NULL value is returned. On failure, NULL is returned.
SEE ALSO
libhugetlbfs(7), gethugepagesize(3), gethugepagesizes(3)
AUTHORS
libhugetlbfs was written by various people on the libhugetlbfs-devel mailing list.
March 7, 2012 HUGETLBFS_FIND_PATH(3)
GETPAGESIZES(3) Library Functions Manual GETPAGESIZES(3)
NAME
getpagesizes - Get the system supported huge page sizes
SYNOPSIS
#include <hugetlbfs.h>
int getpagesizes(long pagesizes[], int n_elem);
DESCRIPTION
The getpagesizes() function returns either the number of page sizes the system supports or the sizes themselves. If pagesizes is NULL and
n_elem is 0, the number of page sizes the system supports is returned. Otherwise, pagesizes is filled with at most n_elem page sizes.
RETURN VALUE
On success, either the number of page sizes supported by the system or the number of page sizes stored in pagesizes is returned. On
failure, -1 is returned and errno is set appropriately.
ERRORS
EINVAL n_elem is less than zero or n_elem is greater than zero and pagesizes is NULL.
errno may also take any of the values described in opendir(3); these errors occur when the sysfs directory exists but cannot be opened.
NOTES
This call returns all page sizes as reported by the kernel. Not every reported huge page size is necessarily usable, since a hugetlbfs
mount point may not be configured for it. To test whether libhugetlbfs can use a given size, call hugetlbfs_find_path_for_size() with
that size and check whether a mount point is configured.
SEE ALSO
oprofile(1), opendir(3), gethugepagesizes(3), libhugetlbfs(7)
AUTHORS
libhugetlbfs was written by various people on the libhugetlbfs-devel mailing list.
October 10, 2008 GETPAGESIZES(3)