So.. here I am on a Lenny box; not my usual fare, mind you, but it has its upsides over my usual CentOS for a very specific application I use a bit, namely ImageMagick, which actually WORKS here.
Anyway.. that's a digression..
I need to send 20k files spread across a couple of directories..
wput, sending a single file at a time, kills the remote server's connection, and command-line ftp dies trying to send 524MB at about 73. The only way I can manage at the moment is one file at a time, without starting more than one transfer.
Pretty straightforward, but if I fork to the background and let it fly, it flies, and all the transfers die. The linear approach, using find to compile the list and a do loop to send the files one by one, doesn't seem to work either: I end up with "skipped" messages and no files on the remote server.
I need an intelligent CLI-based uploader so I can fire it from 5 different places to 5 different places.. something that validates every file transfer and retries all night, with a varying and maybe long retry window.
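For reference, this is the shape of what I'm after. A minimal sketch using Python's standard ftplib (the host, credentials, and retry parameters are placeholders, not anything from my actual setup): upload one file at a time over a single connection, verify each transfer by comparing remote size against local size, and retry the leftovers with a growing, jittered delay so it can plug away all night.

```python
import ftplib
import os
import random
import time

def backoff_delays(base=60, cap=3600, attempts=8):
    """Retry delays (seconds) that roughly double up to a cap,
    with random jitter so retries don't land in lockstep."""
    delays = []
    d = base
    for _ in range(attempts):
        delays.append(d + random.uniform(0, d))  # jitter in [d, 2d]
        d = min(d * 2, cap)
    return delays

def upload_one(ftp, local_path, remote_name):
    """Upload a single file, then validate it by size on the server."""
    local_size = os.path.getsize(local_path)
    with open(local_path, "rb") as f:
        ftp.storbinary("STOR " + remote_name, f)
    return ftp.size(remote_name) == local_size

def upload_all(host, user, password, files):
    """One connection, one file at a time; on any failure, wait out
    the next backoff delay, reconnect, and retry whatever is left."""
    pending = list(files)
    for delay in [0] + backoff_delays():
        time.sleep(delay)
        try:
            ftp = ftplib.FTP(host, user, password)
            ftp.voidcmd("TYPE I")  # binary mode, so SIZE is byte-accurate
            still_failed = []
            for path in pending:
                name = os.path.basename(path)  # flat remote layout, for brevity
                try:
                    if not upload_one(ftp, path, name):
                        still_failed.append(path)
                except ftplib.all_errors:
                    still_failed.append(path)
            ftp.quit()
            pending = still_failed
        except ftplib.all_errors:
            pass  # connection died outright; retry after the next delay
        if not pending:
            break
    return pending  # whatever never made it across
```

Run one instance per source/destination pair; since each instance keeps a single connection and a single in-flight file, it shouldn't trip whatever kills the server under parallel load.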
Peter