Copy data over a TB


 
# 1  
Old 12-23-2011
Copy data over a TB

Hi All,

We are not able to grow a UFS filesystem because its size would go over a TB and it was not created with the newfs -T option.

Hence we have decided to back up all the files onto another filesystem and recreate it with newfs -T.
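
For reference, the recreate step we have in mind would be roughly as follows (just our sketch, assuming the -T option is actually available on our release and that d251 is unmounted at that point):
Code:
# sketch only -- assumes -T is available on this Solaris release and d251 is unmounted
newfs -T /dev/md/rdsk/d251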

Please recommend the most reliable command to copy around a TB of data.

We have thought of using ufsdump and ufsrestore as below:
Code:
ufsdump 0f - /dev/md/rdsk/d251 | (cd /NEW_FS; ufsrestore xf -)

Since ufsdump will write to standard output and ufsrestore will read from standard input, I am not sure whether the pipe will be able to handle around a TB of data.

The other option is to copy with cp -pr, but I am not sure whether links will be preserved that way.
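
Either way, a rough sanity check we could run after the copy (the /OLD_FS path below is just a placeholder for wherever d251 is currently mounted) is to compare the symlink counts on both sides:
Code:
# rough check only -- /OLD_FS stands for the current mount point of d251
find /OLD_FS -type l | wc -l
find /NEW_FS -type l | wc -l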

Please advise.

The OS is Solaris 9 update 8.

Regards,
Vishal
# 2  
Old 12-23-2011
Copy data over a TB

Hi vishalaswani, there is a big difference in copy performance depending on whether the filesystem holds a lot of little files or a few big files. I think that in the first case you can use the command you describe, but in the second case you could try rsync instead; it will be faster. The command would be, for instance:

Code:
rsync -Havx ORIGINAL_FOLDER DESTINATION_FOLDER

DESTINATION_FOLDER can be a local folder or something like server:/folder. With these parameters you will get a mirror copy, including the links. You can use -e rsh if you want to avoid the default ssh transport. Of course, if the folder structure is really big, you can launch a "foreach" loop with one rsync per folder, so several folders are copied at the same time, which cannot be done with ufsdump; see the sketch below. I copy around 4.5 TB every night from my main NFS server to a slow-disk machine with a script based on this idea.
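
A rough sketch of that per-folder idea, written as a plain sh loop (paths and the server name are placeholders):
Code:
# sketch: one backgrounded rsync per top-level directory of ORIGINAL_FOLDER;
# plain files and dotfiles sitting directly in ORIGINAL_FOLDER are not covered
cd /ORIGINAL_FOLDER || exit 1
for d in */ ; do
    rsync -Havx "$d" server:/DESTINATION_FOLDER/"$d" &
done
wait    # let all the background copies finish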

Also, you can use the same kind of pipeline with tar instead of ufsdump; it is a little bit faster, but remember the E option if you have long names. For instance:
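
A sketch of that tar variant with the same pipe idea (/OLD_FS and /NEW_FS are placeholder mount points):
Code:
# sketch: local tar-to-tar copy; E writes extended headers for long names
cd /OLD_FS && tar cEf - . | ( cd /NEW_FS && tar xpf - )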
# 3  
Old 12-23-2011
Is the -T option available in Solaris 9? Maybe you can do an OS upgrade and use ZFS, which is easier to manage...
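
For illustration only, the ZFS side after an upgrade would look roughly like this (the pool name and disk are made up); there is no newfs -T step because a ZFS filesystem simply grows with the pool:
Code:
# illustration only -- pool name and device are invented for the example
zpool create datapool c1t1d0
zfs create datapool/data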