FTP Connection dies out


 
# 1  
Old 10-02-2010
FTP Connection dies out

Hi,

I transfer around 80 files after connecting to an FTP server, but after 2 minutes the connection times out and dies. The server has a 2-minute timeout for idle connections. My question: isn't transferring files considered activity? Is the connection really treated as idle while files are being transferred?

Also, in the code, how can I keep the connection active for more than 2 minutes? I want the connection to stay alive until all my files are transferred. How can I do this? Please advise.

---------- Post updated at 10:27 AM ---------- Previous update was at 12:14 AM ----------

Can somebody please respond to this?

Last edited by vasuarjula; 10-02-2010 at 11:26 AM..
# 2  
Old 10-02-2010
The problem is on the remote server. You know those numbers that appear when ftp does things? We need the number and message. I am guessing you are getting something like 426.

See List of FTP server return codes - Wikipedia, the free encyclopedia

If 426 is the case, chat with the admin of the remote box. Or, if the sysadmin cannot change it, simply make one connection for each file, as sketched below.
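
A minimal sketch of the one-connection-per-file idea, assuming a batch-capable command-line ftp client; the host, credentials and directories are placeholders:

#!/bin/sh
# Open a fresh ftp session for every file, so the server's 2-minute
# limit never applies to more than a single transfer.
FTPHOST=ftp.example.com
FTPUSER=myuser
FTPPASS=mypass

for f in /path/to/outgoing/*; do
  ftp -inv "$FTPHOST" <<EOF
user $FTPUSER $FTPPASS
binary
lcd /path/to/outgoing
cd /remote/dir
put $(basename "$f")
bye
EOF
done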
# 3  
Old 10-03-2010
Thanks for the reply.
But I still have a question. Isn't transferring a file to the remote server considered activity on the remote server? We transfer files with the PUT command, so isn't PUT activity on the remote server?
# 4  
Old 10-03-2010
Are you able to post the Operating System of the remote server and the "ftpd" command line from the remote server?
If the remote "ftpd" server has the parameter "-T 120" rather than "-t 120", this could cause the effect you see.
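
If you can get onto the remote box, a quick way to see how ftpd is invoked; the exact file depends on whether it runs from inetd, xinetd, or standalone, so treat the paths as examples:

# Classic inetd setup: look for the ftpd line and its -t/-T options
grep ftpd /etc/inetd.conf

# xinetd setup (the service file name varies by ftp daemon)
cat /etc/xinetd.d/ftp 2>/dev/null

# Or inspect a running daemon's arguments
ps -ef | grep '[f]tpd'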
# 5  
Old 10-03-2010
FTP is so old that it uses two connections, even in passive mode, so you may lose the control connection during the transfer of a long file or command. Consider moving the files in a zip, or using a newer protocol like ssh/scp or rcp that uses one connection. scp has optional compression, and encryption for security, at the cost of CPU, especially at the sending end. Both scp and rcp have a subtree recursion mode, -r, that moves a whole tree in one connection. Sometimes I send big files with multiple parallel scp -C commands, as the compression and encryption leave some net bandwidth unused. Sometimes I send data compressed with the faster 'compress' (LZW, faster but weaker than gzip) over ssh or rsh, like this pull:

$ rsh -n source_host "compress <remote_file" | uncompress >local_file

The network may not be slow enough to justify the slower gzip compression, even though its output is usually about half the size. Similarly, you can collect files with cpio as an archiver writing to stdout, pipe that through a compression tool and then through rsh/ssh to the matching decompression tool on the other end, and finally into cpio to unpack the stream; a sketch follows. I used this to update a whole subtree in Hong Kong over a 56K WAN from NJ, back in the day. cpio also has a mode that skips files that are not older than what is already there.
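
A rough sketch of that cpio pipeline, pushing the current subtree; the host and destination directory are placeholders:

# -o: write archive to stdout; -i: extract; -d: create directories; -m: keep mtimes
find . -depth -print | cpio -o | compress |
  ssh remote_host 'cd /dest/dir && uncompress | cpio -idm'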
# 6  
Old 10-03-2010
As the O/P omitted to mention the Operating System(s) involved, it is hard to offer anything more specific.
There are established techniques for dealing with a connection break during an ftp transfer.
It would help if the O/P posted what was expected, what was typed, and what happened.
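
For example, a broken download can often be resumed with "reget", which restarts from the current size of the partial local file. A sketch, with host, credentials and file names as placeholders:

ftp -inv ftp.example.com <<EOF
user myuser mypass
binary
cd /remote/dir
reget bigfile.tar
bye
EOF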
# 7  
Old 10-22-2010
Interesting thread..

So.. here I am on a Lenny box, not my usual fare, mind you, but it has its upsides over my usual CentOS for a very specific application I use a bit, namely it WORKS! (ImageMagick).

Anyway.. that's a digression..

I need to send 20k files spread across a couple of directories..

wput, one file at a time, kills the remote server's connection; command-line ftp dies trying to send 524 MB at about 73. The only way I can manage at the moment is a single file at a time, without starting more than one.

Pretty straightforward, but if I fork it into the background and let it fly, it flies, and then all the transfers die. The linear approach, using find to compile the list and a do loop to send them all, doesn't seem to work either; I end up with "skipped" messages and no files on the remote server.

I need an intelligent uploader that's CLI-based so I can fire it from 5 different places to 5 different places.. something that validates every file transfer and retries all night with a varying and maybe long retry window. Something along the lines of the sketch below, perhaps.
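
One possibility, if lftp is installed, is its mirror mode in the reverse (upload) direction: it compares what is already on the server, uploads what is missing, and keeps reconnecting and retrying by itself. A sketch, with host, credentials and directories as placeholders:

#!/bin/sh
FTPHOST=ftp.example.com
FTPUSER=myuser
FTPPASS=mypass

lftp -u "$FTPUSER,$FTPPASS" "$FTPHOST" <<'EOF'
set net:max-retries 0
set net:reconnect-interval-base 60
mirror -R --only-newer --parallel=1 /local/dir /remote/dir
quit
EOF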

Peter