shell script - ftp downloading several files without mget
Hello guys,
I'm searching for a way to download all files from the root directory of an FTP server through an FTP proxy.
Getting through the FTP proxy and downloading one file with get is no problem, but mget * does nothing!
Code:
ftp -n -i -v <<EOF
open proxyHost proxyPort
user user@ftp_server password
lcd xxxxxxxxxxxx
epsv4   # downloading only works in extended passive mode
mget *  # I want to download all files in the FTP root directory, but this does nothing -- not even an error message
bye
EOF
I read that the shell may not have enough memory to perform mget with a high number of files!?
I'm thinking about storing all filenames in a list and then downloading them with get in a loop (in a later version of the script I will probably want to delete the files after downloading them), but I don't really know how to do this and found no example of downloading files in a loop from an FTP server (it has to be an FTP connection, because a connection over HTTP doesn't work).
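A per-file loop along those lines might look like this -- a minimal sketch, not a tested solution: proxyHost, proxyPort, user@ftp_server, password and filelist.txt are placeholders to substitute. The command sequence is built first so it can be inspected before anything is sent to the server, and the actual ftp invocation is left commented out.

```shell
#!/bin/bash
# Sketch: download every file named in $file_list one at a time,
# then delete it on the server.
file_list=filelist.txt   # one remote filename per line (placeholder)

build_ftp_cmds() {
    local list=$1
    echo "epsv4"                  # extended passive mode, as in the post
    while read -r file; do
        [ -z "$file" ] && continue
        echo "get $file"
        echo "delete $file"       # remove the file after downloading it
    done < "$list"
    echo "bye"
}

# To actually run it through the proxy, uncomment:
# {
#     echo "open proxyHost proxyPort"
#     echo "user user@ftp_server password"
#     build_ftp_cmds "$file_list"
# } | ftp -n -i -v
```

Drop the `delete` line until the download part is proven to work.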
#!/bin/bash
function send_ftp
{
ftp -n -i -v $HOST > $2 <<EOF
open proxyHost proxyPort
user USER PASS
cd $dir_remote
$1
bye
EOF
}
file_log=log.$$
file_list=tmp.$$
dir_remote=your_data
# Maximum number of files you can copy in one mget instruction
max_file=your_data
# IP of the remote host
HOST=your_data
#call to generate the list of files in remote host
send_ftp "dir $dir_remote" $file_log
# In my case the first 12 lines are the "head" of the log file -- check this for your case.
# The grep filter works because the ftp status lines begin with a number and the file lines do not.
sed "1,12 d" $file_log | grep -v "^[0-9]" | sed "s/  */#/g" | cut -f9 -d"#" > $file_list
rm -f $file_log
# To optimize the ftp commands, I build a buffer with the maximum number
# of files that we can pass to the mget instruction.
buffer=""
conta=0
while read file
do
buffer="$buffer $file"
conta=$(expr $conta + 1)
if [ $max_file -eq $conta ]
then
send_ftp "mget $buffer" "/dev/null"
buffer=""
conta=0
fi
done <$file_list
if [ "$buffer" ]
then
send_ftp "mget $buffer" "/dev/null"
fi
rm -f $file_list
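As an aside, the header-skipping and delimiter juggling in that pipeline can be sidestepped with a single awk filter: in a classic Unix-style listing, plain-file lines start with `-` and the filename is the 9th field (this still breaks on filenames containing spaces). A sketch on canned input:

```shell
# Keep only plain-file lines (they start with "-") and print field 9,
# the filename. Status lines ("226 ...") and directories ("d...") drop out.
printf '%s\n' \
  '226 Transfer complete' \
  'drwxr-xr-x   2 owner group     4096 Jan 01 12:00 subdir' \
  '-rw-r--r--   1 owner group     1234 Jan 01 12:00 report.txt' |
awk '/^-/ {print $9}'
# prints: report.txt
```

In the script above, the whole `sed | grep | sed | cut` chain could then become `awk '/^-/ {print $9}' $file_log` -- assuming your server produces that listing format.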
#!/bin/bash
function send_ftp
{
ftp -n -i -v <<EOF
open HOST PORT
user USER PASS
lcd XXXXXXXXXXXXX
$1
bye
EOF
}
file_log=log.$$
file_list=tmp.$$
max_file=10
#call to generate the list of files in remote host
send_ftp "dir" $file_log
# In my case the first 12 lines are the "head" of the log file -- check this for your case.
# The grep filter works because the ftp status lines begin with a number and the file lines do not.
sed "1,12 d" $file_log | grep -v ^[0-9] | sed "s/ */#/g"| cut -f9 -d >$file_list
rm -f $file_log
# To optimize the ftp commands, I build a buffer with the maximum number
# of files that we can pass to the mget instruction.
buffer=""
conta=0
while read file
do
buffer="$buffer $file"
conta=$(expr $conta + 1)
if [ $max_file -eq $conta ]
then
send_ftp "mget $buffer" "/dev/null"
buffer=""
conta=0
fi
done <$file_list
if [ "$buffer" ]
then
send_ftp "mget $buffer" "/dev/null"
fi
rm -f $file_list
I'm not very good at shell scripting and can't really see through this. I get:
Quote:
ftpreu.sh: line 6: user: command not found
ftpreu.sh: line 7: lcd: command not found
ftpreu.sh: line 8: dir: command not found
ftpreu.sh: line 9: bye: command not found
ftpreu.sh: line 10: EOF: command not found
sed: log.328: No such file or directory
cut: option requires an argument -- d
usage: cut -b list [-n] [file ...]
cut -c list [file ...]
cut -f list [-s] [-d delim] [file ...]
I removed the "#" in the cut command because # is for comments, and with it I got "syntax error: unexpected end of file".
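For what it's worth, `#` only starts a shell comment at the beginning of a word, so the delimiter does not need to be removed -- it just has to stay attached to `-d` or be quoted:

```shell
# "cut -f9 -d #" (with a space before the #) loses the delimiter: the lone
# "#" begins a comment, so cut sees no argument and fails with
#   cut: option requires an argument -- d
# Attaching or quoting the "#" keeps it as the delimiter:
echo 'a#b#c#d#e#f#g#h#ninth' | cut -f9 -d'#'
# prints: ninth
```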
Please copy the script exactly as I posted it.
The only things you have to modify are the values of some variables:
Code:
dir_remote=your_data
# Maximum number of files you can copy in one mget instruction
max_file=your_data
# IP of the remote host
HOST=your_data
For example, if the directory on the remote server is /tmp/mydir, the remote server's IP is 10.30.20.50, and you want to copy 30 files in every connection, you need to modify:
Code:
dir_remote=/tmp/mydir
# Maximum number of files you can copy in one mget instruction
max_file=30
# IP of the remote host
HOST=10.30.20.50
And in place of the strings USER and PASS you need to put the user and password for the connection:
Code:
ftp -n -i -v $HOST > $2 <<EOF
open proxyHost proxyPort
user USER PASS
cd $dir_remote
For example, for user=user_ftp and pass=mypass:
Code:
ftp -n -i -v $HOST > $2 <<EOF
open proxyHost proxyPort
user user_ftp mypass
cd $dir_remote
If I have understood you correctly, the problem is that the directory on the remote server from which you want to copy the files contains too many files. For that reason the command
mget *
returns an error, because the expansion of the * character is too long.
The easiest option would be to use get for each of the files, but that means one FTP connection per file.
That is what the max_file variable is for: it sets the maximum number of files to fetch in every connection, so you can optimize the number of connections.
Let's suppose the following situation:
We want to copy the files in the directory /tmp/midirectorio, and that directory contains 200 files. If we define max_file=50, the script will make 4 FTP connections and copy 50 files in each connection.
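The connection count is just the file count divided by max_file, rounded up -- a quick check of the arithmetic:

```shell
# 200 files, 50 per connection -> ceil(200/50) = 4 FTP connections.
total=200
max_file=50
connections=$(( (total + max_file - 1) / max_file ))
echo "$connections connections"   # prints: 4 connections
```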
If I put the address of the ftp_server in the HOST variable, the connection runs into our firewall...
If I put the address of the proxy in the HOST variable, I get: "ftp: connect: Operation timed out".
I don't know what the $HOST> &2 in the ftp command really does.
But to connect I need to go through the proxy:
open proxyHost proxyPort
and with the username: accountname@ftp_server_address
something like: user_ftp@10.30.20.50
and the password is just the password...
I tried it like this, but it doesn't work with your script:
ftp -n -i -v <<EOF
open proxyHost proxyPort
user user@ftp_server password
mget *  # I want to download all files in the FTP root directory, but this does nothing -- not even an error message
bye
EOF
The equivalent in the function is:
Code:
function send_ftp
{
ftp -n -i -v > $2 <<EOF
open proxyHost proxyPort
user user@$HOST password
cd $dir_remote
$1
bye
EOF
}
Quote:
Originally Posted by macProgger23
don't know what $HOST> &2 in the ftp command really does???
&2 is not in my script; my script has:
ftp -n -i -v $HOST > $2
HOST is a variable, and when you write $HOST the shell replaces the string $HOST with its value, so:
ftp -n -i -v 10.10.10.10 > $2
Here the > character redirects the stdout of ftp; $2 is the second parameter passed to the function.
If you look at the call:
send_ftp "dir" $file_log
in this case $file_log is the second parameter of the call.
If you don't want to use variables you can put in the values directly; I wrote the script with variables because I think it is better.
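To illustrate just the `> $2` part with something runnable, here is a toy stand-in for send_ftp (no FTP involved; send_demo is invented for this example):

```shell
send_demo() {
    # Same shape as send_ftp: $1 is the command to run,
    # $2 names the file that captures the stdout.
    echo "pretend ftp output for: $1" > "$2"
}

out_file=demo.$$
send_demo "dir" "$out_file"
cat "$out_file"     # prints: pretend ftp output for: dir
rm -f "$out_file"
```

So in `send_ftp "dir" $file_log`, everything the ftp session prints ends up in $file_log instead of on the screen.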