Copy huge file system


 
# 15  
Old 04-06-2011
I don't know AIX, but with the OSes I do know, unmounting the destination prior to running the dd is a must. I would unmount the source as well if I could. This is one of several reasons why a dd is not a great idea in this case.
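If you do run a dd anyway, a minimal sketch of that sequence using the LV names from this thread (and assuming the target LV is at least as large as the source):
Code:
umount /tgtapp                                   # destination must not be mounted during a raw copy
umount /sfsapp                                   # ideally the source too, so the data is quiescent
dd if=/dev/sfsapplv of=/dev/tgtapplv bs=1048576  # 1 MB blocks; bs=512 would be painfully slow
fsck -y /dev/tgtapplv                            # verify the copied filesystem before mounting it
mount /tgtapp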
# 16  
Old 04-07-2011
The below command did not work for me:

Code:
dd if=/dev/sfsapplv of=/dev/tgtapplv bs=512 

and

The below command worked, but when it finished after 3 hours the data was incomplete: the source mount point is 80% used, while the target mount point shows only 30% used after the copy.


Code:
cd /sfsapp
tar -cf - . | (cd /tgtapp && tar -xf -)
Guys,

I'm looking for a robust command (dd, cp, or anything else) that will reliably copy this huge data set to the other mount point.



Please advise.
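One AIX-native alternative often suggested for large trees is pax; in copy mode it writes no archive at all, so the old tar size limits don't apply. A sketch, assuming the same mount points:
Code:
cd /sfsapp
pax -rw -pe . /tgtapp    # -rw = read/write (copy) mode, -pe = preserve everything; run as root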

Last edited by Mr.AIX; 04-07-2011 at 02:39 PM..
# 17  
Old 04-07-2011
Compare the number of files and directories instead of looking at df output.
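For example, a quick way to compare counts on both sides:
Code:
find /sfsapp -type f | wc -l    # regular files in the source
find /tgtapp -type f | wc -l    # regular files in the target
find /sfsapp -type d | wc -l    # directories in the source
find /tgtapp -type d | wc -l    # directories in the target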

If things still don't look right, you might try installing the rsync RPM:

IBM AIX Toolbox Download Page - By Date

Syntax
Code:
rsync -a /sfsapp/ /tgtapp/

You can throw a -P in there to show progress, and a -n will give you a dry run so you can see what it would do before a real run. Add -v if you want verbose output.
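Putting those together, a typical sequence might be:
Code:
rsync -avn /sfsapp/ /tgtapp/    # dry run: list what would be copied
rsync -avP /sfsapp/ /tgtapp/    # real run, with per-file progress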

Last edited by juredd1; 04-07-2011 at 02:59 PM.. Reason: Fix syntax
# 18  
Old 04-08-2011
Please give me more information about this command:

Code:
rsync -/sfsapp/ /tgtapp
Let me know: what exactly will it do?

---------- Post updated at 02:58 AM ---------- Previous update was at 02:48 AM ----------

Can anyone provide a list of all the dd command options?
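The authoritative list is in the manual page; the options most people actually use are summarized here:
Code:
man dd    # full option list for your AIX level
# if=FILE            read from FILE instead of stdin
# of=FILE            write to FILE instead of stdout
# bs=N               read and write N bytes at a time
# count=N            copy only N input blocks
# skip=N / seek=N    skip N blocks on input / output before copying
# conv=sync,noerror  pad short reads and continue past read errors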
# 19  
Old 04-08-2011
Quote:
Originally Posted by Mr.AIX
The below command worked, but when it finished after 3 hours the data was incomplete: the source mount point is 80% used, while the target mount point shows only 30% used after the copy.
This sounds quite suspicious. Could you please post the AIX version you are using (and, if applicable, whether it is a 32-bit or 64-bit version)? This sounds like an older version that cannot deal with files larger than 8 GB (ustar) or one hitting a 2 GB file size limit.

While you are at it: output of "ulimit -a" might also be interesting.
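For instance, the usual commands to gather all of that on AIX (a sketch):
Code:
oslevel -s              # exact AIX level and service pack
getconf KERNEL_BITMODE  # whether the kernel runs in 32- or 64-bit mode
ulimit -a               # per-process limits, including max file size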

bakunin
# 20  
Old 04-09-2011
Quote:
Originally Posted by bakunin
This sounds quite suspicious. Could you please post the AIX version you are using (and, if applicable, whether it is a 32-bit or 64-bit version)? This sounds like an older version that cannot deal with files larger than 8 GB (ustar) or one hitting a 2 GB file size limit.

While you are at it: output of "ulimit -a" might also be interesting.

bakunin

AIX version is: AIX 6.1
64-bit


---------- Post updated at 03:06 AM ---------- Previous update was at 02:54 AM ----------

I'm now running this command; it's still going... let's see.

Code:
rsync -/sfsapp and /tgtapp 
---------- Post updated at 03:59 AM ---------- Previous update was at 03:06 AM ----------

It failed! Please advise...

Code:
rsync: writefd_unbuffered failed to write 4092 bytes to socket [sender]: Broken pipe (32)
rsync: connection unexpectedly closed (14471473 bytes received so far) [sender]
rsync error: error allocating core memory buffers (code 22) at io.c(600) [sender=3.0.6]
# 21  
Old 04-09-2011
Hi Mr. AIX,

We'll find you a good command to copy your data, but I'm a little concerned you're going to fill up your file system or something!

Let's make sure the basics are OK before we lump 250GB around.

Can you show the output of:
Code:
df -k /sfsapp
df -k /tgtapp
lsattr -El sys0 | grep realmem
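If realmem looks plentiful but rsync still dies allocating buffers, the data-segment ulimit is another thing worth checking (an assumption, not a confirmed cause):
Code:
ulimit -d             # current data segment limit, in kilobytes
ulimit -d unlimited   # raise it for this shell, if the hard limit allows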

The command you posted doesn't match the one juredd1 posted. This will do a test run of an rsync:
Code:
rsync -na /sfsapp/ /tgtapp
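If the dry run looks sane, the real copy is the same command without -n (add -P if you want progress):
Code:
rsync -aP /sfsapp/ /tgtapp/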
