Copy huge file system


 
#29, 04-22-2011
If you had read your own thread, you would have seen that this was suggested earlier - together with a few mount options that would make your copy even faster ... see #24

Regards
zxmaus
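Post #24 is not quoted in this excerpt, but a sketch of the general approach being referenced, assuming AIX JFS2 release-behind mount options and illustrative device and path names, might look like:

    # Mount the target with release-behind so a large sequential copy
    # does not flush useful pages out of the file cache (JFS2 on AIX):
    mount -o rw,rbrw /dev/fslv01 /target

    # Copy the tree with pax, preserving ownership, modes and timestamps:
    cd /source && pax -rw -pe . /target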


10 More Discussions You Might Find Interesting

1. Solaris

Split huge File System

Gents, I have a huge NAS file system mounted as /sys, 10 TB in size, and I want to split it into separate 1 TB file systems to be mounted on the server. How can I do that without changing anything on the source? I would appreciate your support. (1 Reply)
Discussion started by: AbuAliiiiiiiiii

2. Solaris

Backup for NAS huge File system

Gents, I have a NAS file system mounted in Solaris as \Sysapp, 8 TB in size. The problem is that once the backup starts, it impacts the performance of the OS. Do you have any idea how we can back up this FS quickly without impacting the OS? Backup type: NetBackup (3 Replies)
Discussion started by: AbuAliiiiiiiiii
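For discussion #2, one generic way to keep a backup from starving the rest of the system is to throttle its transfer rate; a minimal sketch with rsync (not NetBackup-specific, and the paths and limit are assumptions):

    # --bwlimit caps the transfer in KiB/s (here roughly 50 MB/s);
    # -a preserves ownership, permissions and timestamps:
    rsync -a --bwlimit=51200 /Sysapp/ backuphost:/backup/Sysapp/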

3. Solaris

The Fastest for copy huge data

Dear Experts, I would like to know the best method for copying around 3 million files (spread over a hundred folders, each file around 1 KB) between 2 servers. I have already tried the rsync and tar commands, but they take too long. Please advise. Thanks, Edy (11 Replies)
Discussion started by: edydsuranta
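For discussion #3, per-file overhead dominates when copying millions of 1 KB files, so streaming one archive over a single connection is usually far faster than file-by-file transfers; a sketch (host and directory names are assumptions):

    # Pack the source tree into one tar stream and unpack it remotely;
    # a single connection avoids a network round trip per file:
    cd /data/src && tar -cf - . | ssh user@remotehost 'cd /data/dst && tar -xf -'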

4. Shell Programming and Scripting

Aggregation of Huge files

Hi Friends !! I am facing a hash-total issue while working over a set of very large files. Command used: tail -n +2 <File_Name> | nawk -F"|" -v qq='"' '{gsub(qq,""); sa+=($156<0)?-$156:$156} END {print sa}' OFMT='%.5f' The file is pipe-delimited and column 156 is used for hash totalling.... (14 Replies)
Discussion started by: Ravichander
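For discussion #4, a cleaner equivalent of the quoted command, keeping the poster's placeholder file name, delimiter and column number (the reconstruction of the garbled -v assignment is an assumption):

    # Skip the header line, strip embedded double quotes, and sum the
    # absolute value of pipe-delimited column 156 with five-decimal output:
    tail -n +2 File_Name | nawk -F'|' '{gsub(/"/, ""); sa += ($156 < 0) ? -$156 : $156} END {printf "%.5f\n", sa}'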

5. Shell Programming and Scripting

Copy files with pattern from ext4 to cifs file system

Hi, I have a shell script that copies a pattern of files from a Linux to a Windows file system. When I execute the command below: cp -av TOUT_05-02-13* Windows/Folder `TOUT_05-02-13-19:02:37.tar.gz' -> `Windows/Folder/SYSOUT_05-02-13-19:02:37.tar.gz' cp: cannot create regular file... (5 Replies)
Discussion started by: rakeshkumar
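For discussion #5, the cp failure is typical of CIFS targets: Windows file systems reject ":" in file names, and the quoted names contain timestamps with colons. A sketch that renames on the fly, reusing the pattern and target directory from the quoted post:

    # Replace ":" with "-" in each destination name so the CIFS mount accepts it:
    for f in TOUT_05-02-13*; do
        cp -av "$f" "Windows/Folder/$(echo "$f" | tr ':' '-')"
    done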

6. UNIX for Dummies Questions & Answers

Copy huge data into vi editor

Hi All, HP-UX dev4 B.11.11 U 9000/800 3251073457 I need to copy a large amount of data from a Windows text file into the vi editor. When I tried, the formatting of the data was not preserved and it appeared scattered through vi, something like the sample given below. Please let me know how I can correct this. ... (18 Replies)
Discussion started by: alok.behria
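For discussion #6, pasted text usually scatters because vi's autoindent re-indents every incoming line; turning it off for the duration of the paste is the standard cure:

    " Inside vi, before pasting:
    :set noautoindent
    " ... paste ... then restore:
    :set autoindent

Better still, skip the paste entirely and transfer the file with ftp or rcp, so no terminal formatting is involved.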

7. Shell Programming and Scripting

Compare 2 folders to find several missing files among huge amounts of files.

Hi, all: I've got two folders, say "folder1" and "folder2". Under each there are thousands of files. It's quite obvious that some files are missing from each; I just would like to find them. I believe this can be done with the "diff" command. However, if I change the above question a... (1 Reply)
Discussion started by: jiapei100
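For discussion #7, comm on sorted name lists reports files present in one folder but missing from the other; a sketch assuming plain file names without newlines and a shell with process substitution (bash or ksh93):

    # Column 1: names only in folder1; column 2: names only in folder2;
    # -3 suppresses the names common to both:
    comm -3 <(ls folder1 | sort) <(ls folder2 | sort)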

8. UNIX for Dummies Questions & Answers

copy and paste certain many lines of huge file in linux

Dear All, I am working on a Windows OS but remote into a Linux machine. I wonder how to copy and paste part of a huge file on the Linux machine. The content of the file looks as follows: ... dump annealling all custom 10 anneal_*.dat id type x y z q timestep 0.02 run 200000 Memory... (2 Replies)
Discussion started by: ariesto
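For discussion #8, sed can pull an arbitrary line range out of a huge file without opening it in an editor; a sketch (the range and file names are assumptions):

    # Print lines 10000-20000 into a new file and stop reading at line 20000,
    # so the rest of the huge file is never scanned:
    sed -n '10000,20000p; 20000q' huge_input.dat > extract.dat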

9. UNIX for Advanced & Expert Users

Huge files manipulation

Hi, I need a fast way to delete duplicate entries from very huge files (>2 GB); these files are plain text. I tried all the usual methods (awk / sort / uniq / sed / grep ...) but it always ended with the same result (a memory core dump). I'm using HP-UX large servers. Any advice will... (8 Replies)
Discussion started by: Klashxx
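For discussion #9, sort performs an external merge sort using temporary files, so it copes with inputs much larger than RAM where purely in-memory tools dump core; a sketch (the temp directory choice is an assumption):

    # -u emits each distinct line once; -T points the on-disk merge
    # at a filesystem with plenty of free space:
    sort -u -T /var/tmp huge.txt > huge.dedup.txt

Note that this reorders the lines; if the original order must be kept, a different approach is needed.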

10. UNIX for Dummies Questions & Answers

Difference between two huge files

Hi, As per my requirement, I need to take the difference between two big files (around 6.5 GB) and write the difference to an output file without any line numbers or '<' or '>' in front of each new line. As the diff command won't work for big files, I tried to use bdiff instead. I am getting incorrect... (13 Replies)
Discussion started by: pyaranoid
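For discussion #10, when line order is not significant, sorting both files and running comm gives a marker-free difference and scales to multi-gigabyte inputs; a sketch (file names are assumptions):

    sort big_old.txt > old.sorted
    sort big_new.txt > new.sorted
    # Lines in new.sorted absent from old.sorted, with no '<'/'>' prefixes:
    comm -13 old.sorted new.sorted > difference.txt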
GETHUGEPAGESIZES(3)          Library Functions Manual          GETHUGEPAGESIZES(3)

NAME
       gethugepagesizes - Get the system supported huge page sizes

SYNOPSIS
       #include <hugetlbfs.h>

       int gethugepagesizes(long pagesizes[], int n_elem);

DESCRIPTION
       The gethugepagesizes() function returns either the number of system supported huge page sizes or the sizes themselves. If pagesizes is NULL and n_elem is 0, then the number of huge pages the system supports is returned. Otherwise, pagesizes is filled with at most n_elem page sizes.

RETURN VALUE
       On success, either the number of huge page sizes supported by the system or the number of huge page sizes stored in pagesizes is returned. On failure, -1 is returned and errno is set appropriately.

ERRORS
       EINVAL n_elem is less than zero, or n_elem is greater than zero and pagesizes is NULL. Also see opendir(3) for other possible values for errno; this error occurs when the sysfs directory exists but cannot be opened.

NOTES
       This call will return all huge page sizes as reported by the kernel. Not all of these sizes may be usable by the programmer since mount points may not be available for all sizes. To test whether a size will be usable by libhugetlbfs, hugetlbfs_find_path_for_size() can be called on a specific size to see if a mount point is configured.

SEE ALSO
       oprofile(1), opendir(3), hugetlbfs_find_path_for_size(3), libhugetlbfs(7)

AUTHORS
       libhugetlbfs was written by various people on the libhugetlbfs-devel mailing list.

October 10, 2008                                               GETHUGEPAGESIZES(3)
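A minimal caller for the interface documented above; the two-call pattern (query the count, then fetch the sizes) follows the DESCRIPTION section, and the rest of the program is an illustrative sketch:

    #include <stdio.h>
    #include <stdlib.h>
    #include <hugetlbfs.h>

    int main(void)
    {
        /* With pagesizes == NULL and n_elem == 0, only the count is returned. */
        int n = gethugepagesizes(NULL, 0);
        if (n < 0) {
            perror("gethugepagesizes");
            return 1;
        }
        if (n == 0) {
            puts("no huge page sizes reported by the kernel");
            return 0;
        }

        long *sizes = malloc(n * sizeof(long));
        if (sizes == NULL)
            return 1;

        /* Second call fills at most n entries with the supported sizes. */
        n = gethugepagesizes(sizes, n);
        for (int i = 0; i < n; i++)
            printf("huge page size: %ld bytes\n", sizes[i]);

        free(sizes);
        return 0;
    }

Compile against the library with something like: cc demo.c -lhugetlbfs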