12-23-2011
Copy data over a TB
Hi vishalaswani, there is a big difference in the copy performance of a filesystem depending on whether you have a lot of little files or a few big files. If you are in the first case, you can use the command you describe, but if you are in the second case you can try rsync instead; it will be faster. The command would be, for instance:
rsync -Havx ORIGINAL_FOLDER DESTINATION_FOLDER
Where DESTINATION_FOLDER can be a local folder or something like server:/folder. With these parameters you get a mirror copy, including the links. You can use -e rsh to override the default ssh transport. Of course, if the folder structure is really big, you can launch a "foreach" loop with one rsync per folder, so you can copy several folders at the same time, which cannot be done with ufsdump. I copy around 4.5TB every night from my main NFS server to slower disk machines with a script based on this idea.
You can also use the same command with tar instead of ufsdump; it is a little bit faster, but remember the -E option if you use long names.
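The per-folder "foreach" idea from the post above can be sketched roughly like this. This is a hypothetical bash sketch: the parallel_rsync name, the MAXJOBS limit, and the directory layout are assumptions, not from the original post.

```shell
#!/bin/bash
# Hypothetical sketch of the "one rsync per folder" idea: launch one
# rsync per top-level directory of SRC, at most MAXJOBS at a time.

MAXJOBS=4

parallel_rsync() {
    src=$1
    dst=$2
    for d in "$src"/*/; do
        # Mirror this subdirectory (preserve hard links, archive mode,
        # stay on one filesystem) in the background.
        rsync -Havx "$d" "$dst/$(basename "$d")/" &
        # Throttle: keep no more than MAXJOBS copies running at once.
        while [ "$(jobs -pr | wc -l)" -ge "$MAXJOBS" ]; do
            sleep 1
        done
    done
    wait    # let the remaining background copies finish
}
```

Called as parallel_rsync /export/data server:/mirror, this copies each top-level folder in its own rsync process, which is the kind of parallelism ufsdump cannot give you.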
10 More Discussions You Might Find Interesting
1. UNIX for Advanced & Expert Users
I have a log file that I would like to copy from my Windows PC to my UNIX PC for further processing. Is there any command that can help me perform this task? I already have a cron to fire up the process, but nothing seems to be coming up.
I am trying to use ftp but nothing is coming forth.... (8 Replies)
Discussion started by: odogbolu98
2. UNIX for Dummies Questions & Answers
We keep getting production files into an input directory.
These files will be processed three times a day.
8:00AM
1:30PM
5:30PM
file1.20041005_05303423.dat
File2.200041005_14232313.dat
Once the files are processed, they are archived immediately to prod/archive directory
... (1 Reply)
Discussion started by: zomboo
3. UNIX for Dummies Questions & Answers
I have an Ingres database logfile that grows constantly, iircp.log. It is always "attached" to the Ingres process that uses it, and I do not want to screw up the data. I have been copying it to another directory and then using vi on the original to reduce the size 34000 lines at a time. What I want... (1 Reply)
Discussion started by: sarge
4. AIX
I need to take data from one database to another on the same machine.
My first attempt has been using restore, but I am getting an error.
Here is the command I am trying:
db2 restore database prod into test
I get the following error:
The container is already in use.
What would... (2 Replies)
Discussion started by: jyoung
5. UNIX for Dummies Questions & Answers
Dear sir/madam
Could you tell me how to copy or get data from tape to any folder in unix?
Thanks, (2 Replies)
Discussion started by: seyha_moth
6. Solaris
I have a few datasets in my zfs pool which have been exported to the non-global zones, and I want to copy the data on those datasets/file systems to my datasets in a new pool mounted in the global zone. How can I do that? (2 Replies)
Discussion started by: fugitive
7. Solaris
Hi gurus
I configured a RAID 5 volume and I've created a filesystem and mounted it to a directory; everything is ready. The purpose of doing it is to move my data from an old filesystem pin02 to the newly created filesystem pin02_new. Please tell me the steps to move data without any... (10 Replies)
Discussion started by: madanmeer
8. UNIX for Dummies Questions & Answers
Hi All,
HP-UX dev4 B.11.11 U 9000/800 3251073457
I need to copy a huge amount of data from a Windows text file into the vi editor. When I tried to copy it, the format of the data was not preserved and it appeared scattered through vi, something like given below. Please let me know how I can correct this?
... (18 Replies)
Discussion started by: alok.behria
9. Shell Programming and Scripting
I need to copy from specified lines and paste the data into several other lines.
XX123450008 xx.x xx.x xx.x xx.x xx.x xx.x xx.x xx.x xx.x xx.x xx.x xx.x xx.x
XX123451895 xx.x xx.x xx.x xx.x xx.x xx.x xx.x xx.x xx.x xx.x xx.x xx.x xx.x
......
XX123452012 xx.x xx.x xx.x xx.x xx.x xx.x xx.x... (13 Replies)
Discussion started by: ncwxpanther
10. Solaris
Dear Experts,
I would like to know the best method for copying around 3 million files (spread across a hundred folders, each file around 1kb) between 2 servers?
I already tried using rsync and the tar command, but these commands take too long.
Please advice.
Thanks
Edy (11 Replies)
Discussion started by: edydsuranta
LEARN ABOUT FREEBSD
dumpfs
DUMPFS(8) BSD System Manager's Manual DUMPFS(8)
NAME
dumpfs -- dump UFS file system information
SYNOPSIS
dumpfs [-f] [-l] [-m] filesys | device
DESCRIPTION
The dumpfs utility prints out the UFS super block and cylinder group information for the file system or special device specified, unless the
-f, -l or -m flag is specified. The listing is very long and detailed. This command is useful mostly for finding out certain file system
information such as the file system block size and minimum free space percentage.
If -f is specified, a sorted list of all free fragments and free fragment ranges, as represented in cylinder group block free lists, is
printed. If the flag is specified twice, contiguous free fragments are not collapsed into ranges and instead printed in a simple list.
Fragment numbers may be converted to raw byte offsets by multiplying by the fragment size, which may be useful when recovering deleted data.
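As a hypothetical worked example of that conversion (the fragment number and fragment size below are invented, not taken from real dumpfs output):

```shell
# Convert a fragment number to a raw byte offset by multiplying by the
# fragment size, as described above. Both values here are assumptions;
# the real fragment size is reported in the dumpfs superblock listing.
FRAGSIZE=4096   # assumed file system fragment size
FRAG=1234       # assumed fragment number from the dumpfs -f free list
echo $((FRAG * FRAGSIZE))   # prints 5054464
```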
If -l is specified, the pathname to the file system's container derived from its unique identifier is printed.
If -m is specified, a newfs(8) command is printed that can be used to generate a new file system with equivalent settings. Please note that
newfs(8) options -E, -R, -S, and -T are not handled and -p is not useful in this case so is omitted. Newfs(8) options -n and -r are neither
checked for nor output but should be. The -r flag is needed if the filesystem uses gjournal(8).
SEE ALSO
disktab(5), fs(5), disklabel(8), fsck(8), newfs(8), tunefs(8)
HISTORY
The dumpfs utility appeared in 4.2BSD.
BSD                             May 16, 2013                             BSD