Recursive FTP -- here at last.


 
# 50  
Old 12-03-2008
Possible bug

So I tried using this script on some very large files (> 10 GB), and it seems to stall when run with the following flags:
Code:
HardFeed.ksh -vrsf -l "ls -al" 10.11.108.65 ads /ifs/storage1/content/mgm.com

I have pre-created all of the directories that I want files copied from. Initially it looks like HardFeed's timeout value is too small:
Code:
# ../../HardFeed.ksh -vrsf -l "ls -al" 10.11.108.65 ads /ifs/storage1/content/mgm.com
password -
/ifs/storage1/content/mgm.com/MGMI1000000000000100 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000106 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000109 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000109/ADI.XML.UTF-8 is a remote file that has been retrieved
/ifs/storage1/content/mgm.com/MGMI1000000000000109/FORYOUREYESONLY1008HD.mpg is a remote file that has been retrieved
/ifs/storage1/content/mgm.com/MGMI1000000000000118 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000124 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000130 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000136 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000139 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000211 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1770000000000103 is a remote directory that has been ignored
FATAL ERROR: timed out waiting for:
             /tmp/HardFeed.ok.302.2

So I tried increasing the timeout value:
Code:
#  The following 15 seconds is far too small; changing to a larger timeout
#OPT_MAXWAIT=15
#  1 hour timeout: 1 hour * 60 min/hour * 60 s/min = 3600 seconds
OPT_MAXWAIT=3600
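For what it's worth, rather than hand-editing the constant every time, that same line can be made overridable from the environment. This is just a sketch against my copy of the script; HARDFEED_MAXWAIT is a name I made up here, not an option HardFeed actually supports:

```shell
# Use an environment override if the caller set one, otherwise
# default to one hour. Drop this in where OPT_MAXWAIT is assigned.
OPT_MAXWAIT=${HARDFEED_MAXWAIT:-3600}
echo "timeout set to ${OPT_MAXWAIT}s"
```

Then a one-off long run is just `HARDFEED_MAXWAIT=86400 ./HardFeed.ksh ...` with no edit needed.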

Raising the timeout just causes it to take longer to fail:
Code:
# ../../HardFeed.ksh -vrsf -l "ls -al" 10.11.108.65 ads /ifs/storage1/content/mgm.com
password -
/ifs/storage1/content/mgm.com/MGMI1000000000000100 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000106 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000109 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000109/ADI.XML.UTF-8 is a remote file that already exists and is current
/ifs/storage1/content/mgm.com/MGMI1000000000000109/FORYOUREYESONLY1008HD.mpg is a remote file that already exists and is current
/ifs/storage1/content/mgm.com/MGMI1000000000000118 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000124 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000130 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000136 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000139 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1000000000000211 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1770000000000103 is a remote directory that has been ignored
/ifs/storage1/content/mgm.com/MGMI1770000000000103/ADI.XML.UTF-8 is a remote file that has been retrieved
/ifs/storage1/content/mgm.com/MGMI1770000000000103/DIAMONDSAREFOREVER1008HDrev.mpg is a remote file that has been retrieved
/ifs/storage1/content/mgm.com/MGMI1770000000000121 is a remote directory that has been ignored
FATAL ERROR: timed out waiting for:
             /tmp/HardFeed.ok.460.2

HardFeed verifies that the file exists but does not wait for the current FTP transfer to finish before moving on to the next one. This isn't a problem for small files, as they finish quickly; large files, however, seem to trip up the program's logic, and it hangs until the timeout.
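One workaround I can think of (just a sketch of the idea, not HardFeed's actual code) is to poll the local copy and only move on once its size stops growing, i.e. once the FTP child has really finished writing:

```shell
# Block until a file's size stops changing between polls.
# $1 = path to the file, $2 = poll interval in seconds (default 5)
wait_for_stable() {
    file=$1; interval=${2:-5}
    prev=-1
    size=$(wc -c < "$file" | tr -d ' ')
    while [ "$size" -ne "$prev" ]; do
        sleep "$interval"        # give the transfer time to make progress
        prev=$size
        size=$(wc -c < "$file" | tr -d ' ')
    done
    echo "stable at ${size} bytes"
}
```

Crude, since a stalled transfer also looks "stable", but it would at least keep the script from giving up while a 10 GB file is still arriving.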

Has anyone else seen this?
# 51  
Old 12-03-2008
Very few people transfer 10 GB files over the network, so I doubt that anyone else has encountered your problem. Most of us split up very large files and transfer the pieces. If that is not possible and you have only one or two 10 GB files, allow HardFeed to fail, and transfer those files manually using a client with the "reget" command.
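The split-and-transfer approach looks something like this (filenames and sizes are placeholders; any FTP client with "reget" will do for the transfer step):

```shell
cd "$(mktemp -d)"                       # work in a scratch directory
# Demo file standing in for the 10 GB MPEG:
head -c 1048576 /dev/urandom > bigfile.mpg
cksum bigfile.mpg                       # note the checksum before splitting

# 1) split into fixed-size pieces on the sending side
split -b 512k bigfile.mpg bigfile.mpg.part.

# 2) ftp each bigfile.mpg.part.* chunk; "reget" can resume any piece
#    that dies mid-transfer instead of restarting from byte 0

# 3) reassemble on the receiving side and verify the checksum matches
cat bigfile.mpg.part.* > bigfile_rebuilt.mpg
cksum bigfile_rebuilt.mpg
```

The glob sorts the .aa, .ab, ... suffixes in order, so `cat` rebuilds the file correctly.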

Or you can set the timeout to a very large number: 86,400 seconds is one full day.
# 52  
Old 12-03-2008
I guess I forgot to mention that HardFeed times out well after the 10 GB file finishes transferring. HardFeed never picks up the next transfer after the first one completes. Eventually the timeout is reached and the script aborts, but this is well after I stop seeing the size of the file increase. Once it aborts, I compare the file sizes and they are identical.
# 53  
Old 03-25-2009
Thank you!