I want to move a huge amount of data (several gigabytes) from one remote server to another remote server, and I don't have telnet access.
Moreover, I don't want to stay connected to the internet the whole time the transfer is taking place.
Does anybody know of a tool that would be useful to me?
Any help in this regard would be sincerely appreciated.
Eagerly awaiting your response.
I would probably tar up the directories on the remote server that contain the data you need into a disk file, then compress that disk file and see how big it is. Your compressed file may well be less than half the size of the original.
Let's say my data was in /usr2/data
I would use the following command:
tar -cvf mydata /usr2/data
This command will tar all the files together that are in /usr2/data and place them in a diskfile called mydata.
Then compress the file: compress mydata (this produces mydata.Z)
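The archive-and-compress steps can be tried safely in a scratch directory first. This sketch uses gzip, since the classic compress utility is not installed everywhere (compress would produce mydata.Z instead of mydata.gz); the /tmp/demo path is just for the demonstration:

```shell
#!/bin/sh
set -e
# Scratch copy of the layout from the example (your real data lives in /usr2/data)
rm -rf /tmp/demo
mkdir -p /tmp/demo/usr2/data
echo "sample" > /tmp/demo/usr2/data/file1

cd /tmp/demo
tar -cvf mydata usr2/data   # bundle the directory into one disk file
gzip mydata                 # compress it; produces mydata.gz
ls -l mydata.gz             # check how big the compressed file came out
```

Once you are happy with the compression ratio, run the same two commands against the real directory.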
Transfer the file mydata.Z over to your other server, either via ftp or rcp, or copy the file off to tape or diskette and UPS or FedEx the data to your other site.
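If you go the ftp route, the session can be scripted so you don't have to sit at an interactive prompt. A sketch, assuming a placeholder host and login (you would add a "user yourname yourpassword" line at the top of the command file before running it):

```shell
#!/bin/sh
# Build a list of ftp commands to be run non-interactively.
cat > /tmp/ftpcmds <<'EOF'
binary
put mydata.Z
bye
EOF

# Then run something like (otherserver is a placeholder):
#   ftp -n otherserver < /tmp/ftpcmds
cat /tmp/ftpcmds
```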
Load the file mydata.Z onto your other server and place it in a directory that has enough free space to accommodate the file uncompressed.
Uncompress the file: uncompress mydata.Z
Then extract the data from the file mydata:
tar -xvf mydata
This will place the data back into the directory /usr2/data on your remote server.
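The receiving-side steps can be sketched end to end like this. The setup section only fabricates a compressed archive so the example is self-contained and safe to try, and gzip again stands in for compress; on a real .Z file you would use uncompress instead of gzip -d:

```shell
#!/bin/sh
set -e
# --- setup only: fabricate a compressed archive to "receive" ---
rm -rf /tmp/recv
mkdir -p /tmp/recv/stage/usr2/data
echo "payload" > /tmp/recv/stage/usr2/data/file1
(cd /tmp/recv/stage && tar -cf mydata usr2/data && gzip mydata)
cp /tmp/recv/stage/mydata.gz /tmp/recv/

# --- the actual receiving-side steps ---
cd /tmp/recv
gzip -d mydata.gz   # uncompress (for a .Z file: uncompress mydata.Z)
tar -xvf mydata     # extract; restores usr2/data under the current directory
cat usr2/data/file1
```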
Aside from compressing the files down, your best bet, if you have to transfer that amount of data on a daily basis, may be to have an internet connection available full time.
livinfree, I have ftp access on both servers.
I was able to come up with www.nexfer.net;
their tool can transfer files between two remote servers.
Moreover, one can simply queue up commands and disconnect.