Copy large dump file


 
# 1  
Old 04-07-2008
Copy large dump file

Hi Experts,

Could anyone please let me know an easier way to copy a large set of dump files from one server to another? I am trying to copy dump files between two servers in different geographic locations. Other factors such as latency are slowing the process down, but I am curious to know whether there is any other method to copy them faster. I am using scp at present.

Thanks in advance!
# 2  
Old 04-07-2008
Not sure what OS you're using, but you could try using the 'split' command to break the files down into smaller chunks, then use the 'cat' command on the other side to join them back together:

For example:
split -b 1m largefile FILE_

then to rejoin:

cat FILE_* > largefile
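
A rough end-to-end sketch of that approach (the file name, chunk size, and destination host are just placeholders; verify the checksum before deleting anything):

Code:
# on the source server: split into 512 MB chunks and record a checksum
split -b 512m largedump.dmp largedump.dmp.part_
cksum largedump.dmp > largedump.dmp.cksum

# copy the chunks across (several scp sessions can run in parallel)
scp largedump.dmp.part_* largedump.dmp.cksum destserver:/dest/path/

# on the destination server: rejoin and compare against the recorded checksum
cat /dest/path/largedump.dmp.part_* > /dest/path/largedump.dmp
cksum /dest/path/largedump.dmp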

Hope this helps.
# 3  
Old 04-07-2008
Thanks much for your reply. We are using AIX.
# 4  
Old 04-07-2008
If you are not using scp compression, add that.

Maybe look at rsync as well. It can use ssh as the transport, but adds some intelligence to avoid needless transfers.
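
For instance, something like this (host and paths are placeholders):

Code:
# scp with compression enabled
scp -C /source/dumps/largedump.dmp destserver:/dest/path/

# rsync over ssh with compression; files already present on the far side are skipped
rsync -avz -e ssh /source/dumps/ destserver:/dest/path/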
# 5  
Old 04-07-2008
You could look at rsync+ssh, and give ssh the "blowfish" encryption option as it is supposed to be faster than the default.

Like so:
Code:
rsync -avz --ignore-existing -e  'ssh -oConnectTimeout=10 -c blowfish -ax' source-file destserver:/dest/path

Advantage of rsync is that it can resume a broken transfer.

You could also try regular scp with:
Code:
scp -c blowfish -C source-file destserver:/dest/path

[ -C is for compression ]

You could also try piping the large file through tar+gzip over ssh.
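
Roughly like this (host and paths are placeholders):

Code:
# stream the dump directory through gzip and ssh, unpacking on the far side
cd /source && tar cf - dumps | gzip -c | ssh destserver 'cd /dest/path && gunzip -c | tar xf -'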

If you can download using HTTP, aget is another option.

All this may not make much of an impact if your file cannot be compressed, or if your network is too slow.