I have an automated FTP script that is giving me grief. The script runs an FTP sweep for one of our accounts. It gets about 90 files and then hangs. When I run ftp manually from the command line, everything runs great and I pick up the thousands of files waiting.
I used to run lftp, but it has been behaving even worse. Here is a snippet from my shell script that runs the automated pickup.
I've run this on both FreeBSD 6.2 and Gentoo 2007.0, both with the same results: termination, then a loop back to trying to log in again after 90 files or so, at which point I get the "User already logged in" message.
I thought maybe this was firewall related, but since I can run ftp from bash with no problems, I'm lost.
Please, if anyone knows any special voodoo, I would greatly appreciate it. Does FTP have a file limit (count or size) when running from cron? If anyone can help and you need more info, please let me know. THANKS!
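The "manual run works, automated run stalls after a batch of files" symptom often points at active-mode FTP data connections being dropped by a stateful firewall. A minimal sketch of a batch pickup that forces passive mode instead; the host, credentials, and file glob are placeholders, not from the thread:

```shell
#!/bin/sh
# Stage a command file that forces passive mode, then feed it to ftp
# with auto-login disabled. Host and credentials are placeholders.
batch=/tmp/ftp_batch.$$
cat > "$batch" <<'EOF'
open ftp.example.com
user myuser mypass
binary
passive
mget *
bye
EOF
# ftp -n -i < "$batch"      # -n: no auto-login, -i: no per-file prompts
grep -c '^' "$batch"        # prints 6: the staged commands
```

With passive mode, every data connection originates from your side, which stateful firewalls tolerate far better over long multi-file sessions.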
User x has a cron job that looks in a directory and renames the files from one name to another, except it's not working correctly.
. /user/.profile # source the user's profile
for file in `ls`; do
mv $file $file.`date +%Y%m%d%H%M%S``microsec`
done
microsec is a binary with 555 permissions on it in... (5 Replies)
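The usual failure mode in a loop like this is parsing `ls` and leaving expansions unquoted, which breaks on filenames with spaces. A safer sketch of the same rename loop, demoed on a temp directory (the poster's `microsec` suffix is left out; the date stamp stands alone):

```shell
#!/bin/sh
# Iterate a glob instead of parsing `ls`, and quote every expansion so
# odd filenames survive. Demo directory; substitute the real one.
dir=$(mktemp -d)
touch "$dir/report one.txt"      # filename with a space to prove quoting
for file in "$dir"/*; do
    [ -e "$file" ] || continue   # glob matched nothing: empty directory
    mv -- "$file" "$file.$(date +%Y%m%d%H%M%S)"
done
ls "$dir"
```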
I am writing a script, to be placed in a crontab, that retrieves a file that is overwritten every day (with current customer data) and then sends it to another company. I was wondering about the best way to ensure the FTP was successful and to send a success or failure notification.... (3 Replies)
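One complication: classic ftp clients exit 0 even when a transfer fails, so a common approach is to capture the session transcript and check for the 226 "Transfer complete" reply before deciding which notification to send. A sketch, demoed on a canned transcript; the mail addresses and the real ftp invocation are placeholders shown in comments:

```shell
#!/bin/sh
# Check an ftp transcript for the 226 reply rather than trusting the
# client's exit status. The mail(1) step is shown commented out.
ftp_ok() {
    grep -q '^226' "$1"          # $1 = session transcript
}
log=$(mktemp)
# A real run would be:  ftp -n -v remote.host > "$log" 2>&1 <<EOF ... EOF
printf '230 Login successful.\n226 Transfer complete.\n' > "$log"
if ftp_ok "$log"; then
    echo "upload OK"             # mail -s "FTP OK" you@example.com < "$log"
else
    echo "upload FAILED"         # mail -s "FTP FAILED" you@example.com < "$log"
fi
```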
I am on AS3 Update 4 Linux
and am having an issue with an automated FTP script. I tried using the fd/sub-proc method and that did not seem to work either. I normally use the following method to perform my FTPs, but for some reason it works if I launch the script at the command line, yet in cron it... (4 Replies)
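A frequent cause of "works at the command line, fails under cron" is cron's stripped-down environment: minimal PATH, no profile sourced. A quick diagnostic sketch; the file paths here are just illustrative:

```shell
#!/bin/sh
# Dump the environment as cron sees it, then diff against your login
# shell's. Schedule the first line from crontab; run the rest by hand.
env | sort > /tmp/env.cron       # put this command in a crontab entry
# env | sort > /tmp/env.shell    # run this from your interactive shell
# diff /tmp/env.shell /tmp/env.cron
```

Anything that only appears in the shell's dump (PATH entries, proxy variables, .netrc-relevant HOME) is a candidate for why the cron run diverges.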
I am new to cron and am trying to set up a cron job that will run every day at 2 to go out to a server (via FTP, with user name and password), get a file, and then bring it back to my server. Below is what I have, but it doesn't seem to be working. Any help would be appreciated.
cron command - will... (4 Replies)
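For reference, a daily 2 AM entry might look like the sketch below; the script path and log file are placeholders, and absolute paths matter because cron's PATH is minimal:

```shell
# m h dom mon dow  command
0 2 * * * /usr/local/bin/get_file.sh >> /var/log/get_file.log 2>&1
```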
We have a script that we run manually, and we want to set it up to run automatically via cron. The problem is that during the FTP session nothing is put: the script logs in and changes directory fine, but nothing is put to the directory. I have tried it as root, as a power user, etc. I have made the... (4 Replies)
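When the login and remote cd succeed but nothing lands, the usual suspect is the local side: under cron the session starts in $HOME, so a bare `put file` looks for the file in the wrong directory. A sketch that pins the local directory with an explicit `lcd`; host and paths are placeholders:

```shell
#!/bin/sh
# Stage the ftp commands with an explicit lcd so the local file is
# found regardless of cron's working directory (paths illustrative).
batch=$(mktemp)
cat > "$batch" <<'EOF'
lcd /data/outgoing
cd /incoming
put report.csv
EOF
# ftp -n -i remote.host < "$batch"   # preceded by a `user` line for -n
grep -c '^' "$batch"                 # prints 3
```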
Hey guys,
I have a script called my_test:
#!/usr/bin/sh
touch /usr/test/me
The script has been saved in /usr/test/my_test
I have executed chmod 755 /usr/test/my_test
Then I have entered it into the cron file using crontab -e
0,5,10,15,20,25,30,35,40,45,50,55 * * * * /usr/test/my_test... (2 Replies)
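One way to tell whether cron actually fired is to compare the target file's age against the five-minute schedule. A sketch, demoed on a temp file standing in for /usr/test/me:

```shell
#!/bin/sh
# If the job runs every five minutes, the touched file should never be
# older than that. Demo on a temp file; substitute /usr/test/me.
f=$(mktemp)
touch "$f"
if find "$f" -mmin -6 | grep -q .; then
    echo "cron ran recently"
else
    echo "no run in the last five minutes"
fi
```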
When I list what's in the crontab with -l it's fine, but when I try to edit it with -e, it just returns the number 309. Can you not edit cron this way in Solaris 10? I can do it fine in Solaris 8 and 9.
export EDITOR="vi" is set in my profile
I am using BASH
$ sudo crontab -l
Password:
#ident "@(#)root ... (5 Replies)
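The bare number is very likely ed(1) announcing the byte count of the crontab: sudo resets the environment by default, so the EDITOR exported in your profile never reaches crontab, which falls back to ed. Passing the editor through sudo explicitly is one fix; the env-passing mechanism itself can be seen without sudo:

```shell
#!/bin/sh
# sudo's env_reset drops EDITOR, so `sudo crontab -e` falls back to
# ed, which prints the file's byte count (the mysterious "309").
# One fix (not run here):
#   sudo env EDITOR=vi crontab -e
# The same env-injection mechanism, demonstrated without sudo:
env EDITOR=vi sh -c 'echo "$EDITOR"'     # prints: vi
```

If you are already staring at ed's number, typing `q` and Enter quits without changes.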
Hello,
Having an issue with a job scheduled in cron. The script:
#!/bin/bash

# Example shell script which can be added to root's cron job to check the
# Embedded Satellite disk space usage. If any table is over 90% usage, send
# a notice to the default email address... (2 Replies)
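The 90% threshold check the comment describes can be done by parsing `df -P`. A sketch demoed on canned output so it is deterministic; for real use, pipe `df -P` into the same awk (the mount point and sizes below are invented, and the mail step is omitted):

```shell
#!/bin/sh
# Flag any filesystem above 90% capacity. $5 is the Capacity column of
# POSIX `df -P` output; strip the % sign before the numeric compare.
printf '%s\n%s\n' \
    'Filesystem 1024-blocks Used Available Capacity Mounted' \
    '/dev/sda1 1000 950 50 95% /data' |
awk 'NR > 1 { use = $5; sub(/%/, "", use); if (use + 0 > 90) print $6, "at", $5 }'
```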
Hello, I am running Solaris 8. I have set up a cron job that runs every couple of hours. If I run the script manually (logged in as root), it runs just fine. The cron job, however, will not run; it produces an rc=1 error. Any thoughts would be appreciated. Thanks (4 Replies)
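When a job works interactively but cron reports rc=1, capturing the job's own output usually reveals the cause. A crontab sketch; the script path and log file are placeholders, and the hours are listed out because Solaris cron does not support the `*/2` step syntax:

```shell
# Every two hours, with stdout and stderr captured for inspection.
0 0,2,4,6,8,10,12,14,16,18,20,22 * * * /path/to/script.sh > /tmp/script.out 2>&1
```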
I have written a script to sftp yesterday's logs from another server, as below:
cd /export/home/abc/xyz/tt
d=`gdate -d'yesterday' +%Y%m%d`
sftp abc@XXX.XX.XX.XX<<EOF
cd /yyy/logs/archive
mget abc.log.$d*
EOF
cd /export/home/abc/xyz/scripts
nohup ./ss.sh PROD &
It is working fine... (2 Replies)
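A variant worth considering: with a here-document, sftp carries on past failing commands, whereas `sftp -b batchfile` aborts on the first error and exits nonzero, which makes the cron run checkable. A sketch along the lines of the script above, keeping the paths and the elided host from the post, and assuming `gdate` is GNU date (falling back to plain `date` where gdate is absent):

```shell
#!/bin/sh
# Build yesterday's date and a batch file, then run sftp in batch mode
# so any failed mget aborts the session with a nonzero exit status.
d=$(gdate -d yesterday +%Y%m%d 2>/dev/null || date +%Y%m%d)
batch=$(mktemp)
printf 'cd /yyy/logs/archive\nmget abc.log.%s*\n' "$d" > "$batch"
# sftp -b "$batch" abc@XXX.XX.XX.XX || echo "pickup failed" >&2
grep -c '^' "$batch"     # prints 2
```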
Discussion started by: ssk250
2 Replies
LEARN ABOUT SUSE
simpleftp
SIMPLEFTP(1) InterNetNews Documentation SIMPLEFTP(1)

NAME
simpleftp - Rudimentary FTP client
SYNOPSIS
simpleftp url [...]
DESCRIPTION
simpleftp is a Perl script that provides basic support for fetching files with FTP in a batch-oriented fashion. It takes one or more FTP
URLs on the command line. The file(s) will be retrieved from the remote server and placed in the current directory with the same basename
as on the remote; e.g., <ftp://ftp.isc.org/pub/usenet/CONFIG/active.gz> is stored as active.gz in the current directory.
The script properly understands usernames, passwords and ports specified as follows:
ftp://user:password@host:port/path/file
BUGS
simpleftp is an extremely poor substitute for more complete programs like the freely available wget or ncftp utilities. It was written
only to provide elementary support in INN for non-interactive fetching of the files in <ftp://ftp.isc.org/pub/pgpcontrol/> or
<ftp://ftp.isc.org/pub/usenet/CONFIG/> without requiring administrators to install yet another package. Its shortcomings as a general
purpose program are too numerous to mention, but one that stands out is that files downloaded by simpleftp overwrite existing files with the
same name in the local directory.
HISTORY
Tossed off by David C Lawrence <tale@isc.org> for InterNetNews. Rewritten to use "Net::FTP" by Julien Elie.
$Id: simpleftp.pod 8357 2009-02-27 17:56:00Z iulius $
SEE ALSO
actsync(8).
INN 2.5.2 2009-05-21 SIMPLEFTP(1)