Copy huge file system


 
# 22  
Old 04-09-2011
This is the output:

Code:
root
> df -k /sfsapp
/dev/sfsapplv   212129280  36861824   80%  15982043    43% /sfsapp

root
> df -k /tgtapp
/dev/tgtapplv   212129280 158892800   26%   6195440    49% /tgtapp

root
> lsattr -El sys0 | grep realmem
realmem         10485760           Amount of usable physical memory in Kbytes        False

root
> rsync -na /sfsapp/ /tgtapp/
ERROR: out of memory in make_file [sender]
rsync error: error allocating core memory buffers (code 22) at util.c(117) [sender=3.0.6]

# 23  
Old 04-09-2011
OK, you convinced me your data is huge. You must have many files/folders/links. I have a file system like this; I have to rsync subfolders to keep it from failing.
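Something like this sketch is what I mean; it assumes everything of interest sits in top-level directories under /sfsapp (top-level plain files would need one extra pass):

Code:
cd /sfsapp
for d in */ ; do
    # one rsync per top-level directory; trailing slashes copy contents
    rsync -a "/sfsapp/$d" "/tgtapp/$d"
done

Each run then only has to build one subtree's file list in memory, which is what was blowing up in make_file.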

If /tgtapp is being used for other things:
  • You may not have enough free space.
  • You probably have some mess to clean up from your prior attempts.
If /tgtapp is a new file system just for this purpose, clean it up. This deletes everything, so make sure you mean it.
Code:
rm -fr /tgtapp/*

I would expect the commands from bakunin and methyl to work.
Code:
cd /sfsapp; tar cpf - . | (cd /tgtapp; tar xpvf -)

Code:
cd /sfsapp; find . -xdev -print | cpio -pdumv /tgtapp

You didn't mention any errors from the tar/cpio you tried. Did it complete silently?

I put the 'v' (verbose) option on the last commands; for now it's probably worth seeing what they're doing. Shrink down your terminal window to cut the rendering overhead and speed up the transfer.
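If the verbose listing itself ever becomes the slow part, a variation is to push it into a log file and watch that instead (the log path is just an example):

Code:
cd /sfsapp; tar cpf - . | (cd /tgtapp; tar xpvf - > /tmp/copy.log 2>&1) &
tail -f /tmp/copy.log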
# 24  
Old 04-09-2011
As long as both filesystems are on the same box, mount them both with nolog and noatime, and a simple cp -hpr will probably do the trick as well as any of the other commands. Also make sure that the VG is a scalable one.
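Roughly like this, as a sketch; log=NULL is the JFS2 spelling of 'nolog' on newer AIX levels, noatime support depends on your level, and the LV name is taken from the df output above, so verify all three on your system:

Code:
# remount the target with the JFS2 log and atime updates disabled
umount /tgtapp
mount -o noatime,log=NULL /dev/tgtapplv /tgtapp

# -h copies symlinks as links, -p preserves mode/owner/times, -r recurses
cp -hpr /sfsapp/* /tgtapp/   # top-level dot files would need a separate pass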

Regards
zxmaus
# 25  
Old 04-10-2011
Quote:
Originally Posted by Nevyn
OK, you convinced me your data is huge. You must have many files/folders/links. [...] If /tgtapp is a new file system just for this purpose, clean it up. [...] I would expect the commands from bakunin and methyl to work. [...] You didn't mention any errors from the tar/cpio you tried. Did it complete silently?

/tgtapp/ has the same amount of space as /sfsapp/.

I removed /tgtapp/ and then recreated it; it's empty now and has the same space as /sfsapp.

Now I'm using this command:

Code:
cd /sfsapp; tar cpf - . | (cd /tgtapp; tar xpvf -)
It ran for a while, then it hung and now seems to be doing nothing.

And now I'm using this command:


Code:
cd /sfsapp; find . -xdev -print | cpio -pdumv /tgtapp
It's running... it seems it will hang as well. Let's see.

---------- Post updated at 02:36 AM ---------- Previous update was at 02:36 AM ----------

Both of the commands below hung:

Code:
cd /sfsapp; tar cpf - . | (cd /tgtapp; tar xpvf -)

cd /sfsapp; find . -xdev -print | cpio -pdumv /tgtapp

I'm looking for a robust command to copy the data from /sfsapp to /tgtapp.

Please help.

# 26  
Old 04-10-2011
I'm wondering if the source file system has errors. Can you unmount it to perform an fsck, or is it in production use?
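If you can take it offline, it would be something like this (LV name from your df output; fsck needs the filesystem unmounted):

Code:
umount /sfsapp
fsck -y /dev/sfsapplv    # -y answers yes to any repair prompts
mount /sfsapp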

The copies should have shown you which files they were on.

Did they hang on the same file / folder? Similar size transferred? If so, is there anything special about that folder / file?

What made you decide they were hung (e.g., no disk growth, no CPU usage)?
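For example, something along these lines (the process names are guesses based on the commands you ran):

Code:
# sample the target size and the copy processes once a minute; if Free
# stops shrinking and tar/cpio stop accruing CPU time, the copy is hung
while : ; do
    df -k /tgtapp | tail -1
    ps -ef | egrep 'tar|cpio' | grep -v egrep
    sleep 60
done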
# 27  
Old 04-11-2011
How about the cp command

How about the copy (cp) command?

Try:

# cp -rph source_directory destination_directory
# 28  
Old 04-22-2011
Quote:
Originally Posted by manojbiswakarma
How about the copy (cp) command?

Try:

# cp -rph source_directory destination_directory

I'm using your command; it seems to be a good command!

I'll post an update about it!