Copy huge file system


 
# 8  
Old 04-05-2011
I'm talking about the dd command; can you please explain the dd command?

My AIX version is AIX 6.1.

There is nothing in /tgtapp; it's an empty mount point whose file system has the same size as /sfsapp.

/sfsapp and /tgtapp are mount points, each mounted on its own disk.

# 9  
Old 04-05-2011
Quote:
Originally Posted by Mr.AIX
I'm talking about the dd command; can you please explain the dd command?
Quite frankly, I don't understand your obsession with "dd", given that you have been offered several good alternatives. You will not gain anything by using "dd": not speed, not ease of use, and definitely not safety. To me this starts to sound like homework.

Here is your solution; use at your own risk, I haven't tested it.

First, find out which devices are mounted at the respective mount points with the "mount" command, then use these devices as the infile and outfile of "dd".

Example: copy "/ap101" to "/ap102".

Code:
# mount
  node       mounted        mounted over    vfs       date        options      
-------- ---------------  ---------------  ------ ------------ --------------- 
<...SNIP...>
         /dev/lvap101     /ap101           jfs2   Nov 29 11:33 rw,log=/dev/logapdev1vg
         /dev/lvap102     /ap102           jfs2   Nov 29 11:33 rw,nodev,nosuid,log=/dev/logapdev1vg
<...SNIP...>

# dd if=/dev/lvap101 of=/dev/lvap102 bs=512
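
A note on the block size: bs=512 moves one 512-byte block per read/write cycle, which is slow for hundreds of gigabytes. If your dd accepts size suffixes, a larger buffer should be considerably faster. A sketch, untested, assuming your dd understands the "m" suffix:

Code:
# same copy as above, but with a 1 MB buffer instead of 512-byte blocks
dd if=/dev/lvap101 of=/dev/lvap102 bs=1m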

I hope this helps.

bakunin
# 10  
Old 04-06-2011
I have used the command below:
Code:
dd if=/dev/sfsapplv of=/dev/tgtapplv bs=512

but it's taking a long time, more than 30 minutes, without copying anything to /tgtapp.

After I cancelled it, I got the result below:
Code:
2352183+0 records in.
2352183+0 records out.


Can anyone explain?

# 11  
Old 04-06-2011
Quote:
Originally Posted by Mr.AIX
but it's taking a long time, more than 30 minutes, without copying anything to /tgtapp.
I have no idea about your disks' performance, but a quick estimate tells me: if the target disk can write at 20 MB/s continuously, then it will take (250,000/20 =) 12,500 seconds to write 250 GB; 12,500/3600 ≈ 3.5 hours. I doubt your disk (LUN, whatever) can sustain the ~140 MB/s (continuous!) necessary to finish the job in 30 minutes.
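
You can redo that arithmetic with bc, assuming 1 GB ≈ 1000 MB as in the estimate above:

Code:
# 250 GB at a sustained 20 MB/s, expressed in hours
$ echo "scale=2; 250000 / 20 / 3600" | bc
3.47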

Quote:
Originally Posted by Mr.AIX
After I canceled that , I got the below result
Code:
2352183+0 records in.
2352183+0 records out.

Can anyone explain?
Could you be bothered to read the man page of "dd", after insisting on this tool, before asking here? It tells you that, until you cancelled it, dd had read 2352183 blocks of the size given in the "bs" option, and had written as many. As you issued

Code:
dd if=/dev/sfsapplv of=/dev/tgtapplv bs=512

it read 2352183 x 512 = 1204317696 bytes and wrote as many.
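
You can check the multiplication with bc; that is roughly 1.2 GB, so the copy had barely started when you cancelled it:

Code:
# blocks copied times block size = bytes transferred
$ echo "2352183 * 512" | bc
1204317696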

Again, I don't think it is advisable to use "dd" for this task (what's more, it is dangerous in the hands of someone who does not completely know what he is doing), and I won't answer questions easily answered by looking at the man page.

I hope this helps.

bakunin
# 12  
Old 04-06-2011
It's a huge amount of data. I have tried to use your suggested command, but it hung after some time and did not copy all the data.

Do you have any recommended command, a robust one, to copy all the files from /sfsapp to the new mount point /tgtapp?
# 13  
Old 04-06-2011
Have you tried the three other suggestions that were posted? If not, start there.
# 14  
Old 04-06-2011
Quote:
Originally Posted by Mr.AIX
I have tried to use your suggested command, but it hung after some time
You hadn't said so until now. If so, what exactly was the error? The method I described (tar) has worked for me countless times, on amounts of data like this and even larger.

Using "backup" and "restore", "cpio" or "savevg" will equally work and probably at roughly the same speed as "tar".

One thing though: you can't cancel a job after some time and expect it to have finished the task. If you have to cancel it, that means it was still running, and if it was still running, it means it hadn't finished what it was doing.

I hope this helps.

bakunin