

Copy huge file system


 
Operating Systems: AIX
# 1  
Old 04-05-2011
Copy huge file system

Dear Guys,
Using the dd command, or any other robust command, I'd like to copy a huge amount of data from one file system to another.


Source file system: /sfsapp
The file system holds 250 GB of data.
Target file system: /tgtapp


I'd like to copy all of these files and directories from /sfsapp to /tgtapp, as a complete file-system copy, using a robust command.



Please advise.
# 2  
Old 04-05-2011
I suppose you want to preserve absolutely everything: files, ownerships, file modes, ...

I do that (out of habit - presumably there are other equally good ways) using two tars and a pipeline:

Code:
# cd /path/to/sourcedir
# tar -cf - . | (cd /path/to/targetdir ; tar -xf - )

The first tar packs everything to <stdout>, the second one unpacks from <stdin>; this is what the "-" stands for. If you add "-v" to one tar's options you can even watch the progress (at a small expense of performance, of course).
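If you want to rehearse the tar pipe before pointing it at real data, here is a small self-contained sketch using throwaway temporary directories (the paths are examples only, not the poster's actual file systems):

```shell
# Rehearsal of the tar-pipe copy on temporary directories.
# These paths are hypothetical; substitute your real source and target.
src=$(mktemp -d)
dst=$(mktemp -d)

# Create some sample data, including a subdirectory.
mkdir -p "$src/sub"
echo "sample data" > "$src/sub/file.txt"

# The actual copy: pack everything to stdout, unpack from stdin in the target.
( cd "$src" && tar -cf - . ) | ( cd "$dst" && tar -xf - )

# Verify that source and target are identical.
diff -r "$src" "$dst" && echo "copy verified"

rm -rf "$src" "$dst"
```

The subshell around the second tar keeps the `cd` from affecting the rest of the script, which is the same reason for the parentheses in the command above.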

I hope this helps.

bakunin
# 3  
Old 04-05-2011
So my command should be like this ...

Code:
cd /sfsapp
tar -cf - . | (cd /tgtapp ; tar -xf - )

---------- Post updated at 07:17 AM ---------- Previous update was at 07:10 AM ----------

But the number of files is very large. Are you sure this will succeed?
# 4  
Old 04-05-2011
Quote:
Originally Posted by Mr.AIX
But the number of files is very large. Are you sure this will succeed?
Yes, this will work, regardless of the number of files.

You could use "dd" too, but I would prefer a "high-level" approach when possible over a "low-level" approach such as "dd". With "dd" it is easy to overwrite not only files but also devices and volume group information - things quite vital to your system.

With "tar" you deal with files, directories and the like - things that you recognize easily. With "dd" you ultimately deal with devices, which is much more dangerous. If you mistype "/tgtpaath" in the above tar command you will run into a "disk full" error, have the chance to correct it, and start over. If you confuse hdisk127 and hdisk126 in a dd command, it will run just fine but probably destroy data you didn't want to lose.

That doesn't mean at all that "dd" is a bad tool - it is very powerful, but the power comes at the price of being dangerous too. Use it when you need it, not merely because it is possible - that is my principle. Maybe I'm overly cautious, but I have been working as a sysadmin for a long time and I am still alive. ;-))

I hope this helps.

bakunin
# 5  
Old 04-05-2011
OK .. can you please explain exactly how I can use the dd command to move the files from /sfsapp to /tgtapp?
# 6  
Old 04-05-2011
How about using the backup and restore commands?
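A sketch of what that could look like with AIX's backup-by-name mode, reading the file list from stdin and restoring into the target (the flags here are an assumption from memory - verify them against the backup/restore man pages on your AIX level before relying on this):

```shell
# Hypothetical AIX backup/restore pipeline (by name, not by inode).
# -i : backup reads file names from stdin
# -q : suppress the "ready to begin" prompt
# -f - : archive to stdout / from stdin
cd /sfsapp
find . -print | backup -iqf - | ( cd /tgtapp ; restore -xqf - )
```

Like the tar pipe, this never writes an intermediate archive to disk, which matters when the source already holds 250 GB.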
# 7  
Old 04-05-2011
Are /sfsapp and /tgtapp mountpoints?
Is there anything in /tgtapp already?
Though you don't mention the version of AIX or anything about the hardware of the disk subsystem: do you have an AIX version which can deal with files larger than 2 GB?

The usual solution is to use "find" piped to "cpio -p" .
e.g.
Code:
cd /sfsapp
find . -xdev -print | cpio -pdumv /tgtapp

