Large file transfer problem


 
UNIX for Advanced & Expert Users
# 1  
Old 10-22-2005

Hello Everyone,

I can't transfer a large file (a ~15GB tar archive) from one Linux machine to another via FTP.

I have tried the following:

1) A normal FTP transfer of the whole 15GB file. The transfer stops at about 2GB and goes no further.

2) Splitting the 15GB file into 500MB pieces with the split command and transferring the pieces one by one. The transfers work, but when I join the pieces back into the original 15GB file with cat, I get a "File too large" error once the output file reaches about 2GB.
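For reference, the split-and-reassemble workflow described in attempt 2 can be sketched as below. Filenames here are made up for the example, and a comparison on both ends catches silent truncation. This demo uses a small 5MB stand-in file, but the same commands apply to the 15GB archive:

```shell
# Create a 5MB stand-in for the real archive (demo only).
dd if=/dev/urandom of=archive.tar bs=1M count=5 2>/dev/null

# Source side: split into pieces (use -b 500m for the real 15GB file).
split -b 1m archive.tar archive.tar.part.

# ... transfer archive.tar.part.* to the destination via FTP ...

# Destination side: reassemble the pieces in order and verify.
cat archive.tar.part.* > rebuilt.tar
cmp archive.tar rebuilt.tar && echo "reassembly OK"
```

The suffixes that split generates (aa, ab, ac, ...) sort lexically, so the shell glob concatenates the pieces in the right order.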

I have checked things like quotas, disk space, permissions, etc.

Any ideas?

Cheers
# 2  
Old 10-22-2005
Make sure that the filesystem where you are trying to transfer/combine the file has largefile support enabled. Large files are files over 2GB in size.

The 'mkfs -m' command will show you what options were used to create the filesystem, though I do not know exactly how largefile support is handled on Linux.
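Rather than inspecting filesystem creation options, one quick empirical check is to write a single byte just past the 2^31 - 1 byte mark in a sparse file on the destination filesystem; if that fails with "File too large", you have found the culprit:

```shell
# Write one byte at offset 2^31 (just past the 2GB boundary).
# The file is sparse, so this uses almost no disk space.
dd if=/dev/zero of=bigtest bs=1 count=1 seek=2147483648 2>/dev/null \
  && echo "large files supported" \
  || echo "stopped at the 2GB limit"
ls -l bigtest    # size shows 2147483649 bytes if the write succeeded
rm -f bigtest
```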