I currently have a perl script that has to retrieve a single file from 20+ sites every 10 minutes. Right now it will ftp to site1, download, and continue on up through site20. I am trying to get all of the ftp downloads to run at the same time. This is where I have my problem: I can't get it to work.
use Net::FTP;

# $HOMEDIR, @ERRORS and myerr() are globals from the main script.
sub ftp_go_and_get {
    my ($file, $host, $password, $os) = @_;
    my ($f_user, $directory);

    # Pick the login name and remote directory by OS type.
    if (lc($os) eq "windows") {
        $f_user    = "fox\@openfox";
        $directory = "SAM";
    } elsif (lc($os) eq "unix") {
        $f_user    = "fox";
        $directory = "/home/fox/SAM";
    }

    # Net::FTP->new() reports its error in $@, not $!.
    my $ftp = Net::FTP->new($host, Timeout => 60);
    unless ($ftp) {
        push @ERRORS, "Can't ftp to $host ($file): $@\n";
        myerr();
        return;
    }
    print "Connected to $host ($file)\n";

    print "Trying to get in and grab file.\n";
    unless ($ftp->login($f_user, $password)) {
        push @ERRORS, "Can't login to $host ($file): " . $ftp->message;
        $ftp->quit;
        myerr();
        return;
    }
    print "Logged in\n";

    unless ($ftp->cwd($directory)) {
        push @ERRORS, "Can't find a spot to take dump.\n";
        $ftp->quit;
        myerr();
        return;
    }

    # Optional size check, currently disabled:
    # my $f_size = $ftp->size($file);
    # print "The size of this file is $f_size\n";

    # get() returns the local filename on success, undef on failure.
    my $local = $ftp->get($file, "$HOMEDIR/$file");
    unless ($local) {
        push @ERRORS, "Can't get file $file\n";
        $ftp->quit;
        myerr();
        return;
    }
    print "Got file for $file\n";
    print "I took a dump in $local\n";

    $ftp->quit;
}
At the same time!? That may be a giant bandwidth hog... but oookay.
You may be able to do it like this: loop from 1 to 20, one iteration per site.
Well, that, or you can learn to fork in perl.
Basically, here is the logic: loop through all of your sites and, for each site, execute the perl script with an argument saying which site you want to ftp data to/from, then throw it in the background to do its thing (so the loop continues). No guarantees it will work, but it should. A sketch of that loop follows.
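Here is a rough, untested sketch of that idea in Perl (the worker script name ftp.pl and the @sites list are placeholders for whatever you actually use):

foreach my $site (@sites) {
    # The trailing & makes the shell background each worker,
    # so this loop fires off the next transfer immediately.
    system("perl ftp.pl $site &");
}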
So you think I would have to put the subroutine in its own script and just run that in the background? My main script has other functions, but I can try calling the ftp portion with & and see.
Do I even want to use threads for this, then? I was just calling my subroutine as `ftp_go_and_get($ARG1, $ARG2, $ARG3, $ARG4)` before, but that did not give me the desired results either.
I have started to look at fork(), but I have never used it myself.
As for bandwidth, the total of all the files is under 40k, so it is not really an issue here.
So let's call the perl script that does all the ftp stuff ftp.pl.
ftp.pl takes one argument: the site to transfer to/from.
That is assuming you have all of the hosts inside a file called hosts.txt (that way you can just append additional hosts to the bottom). Easy as pie. If you want, you can use rsync, scp, rcp... whatever, on Linux.
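If it helps, a minimal sketch of that driver, assuming hosts.txt holds one host per line:

# Read the host list and launch one background ftp.pl per host.
open my $hosts, '<', 'hosts.txt' or die "Can't read hosts.txt: $!";
chomp(my @sites = <$hosts>);
close $hosts;
system("perl ftp.pl $_ &") for @sites;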
I do have a file where I have all the hosts' IPs, passwords, and OS types stored. I called it sam_sites.pl.
I read the file into an array, @site.
I then have a foreach loop where I take each line and set the variables host, IP, PWD, and OS, and then start the FTP transfer (sketched below).
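For reference, that serial loop probably looks something like this (the host/IP/PWD/OS field order and whitespace-separated format are assumptions based on the description above):

open my $fh, '<', 'sam_sites.pl' or die "Can't open sam_sites.pl: $!";
my @site = <$fh>;
close $fh;

foreach my $line (@site) {
    chomp $line;
    my ($host, $ip, $pwd, $os) = split ' ', $line;
    # Serial version: each transfer must finish before the next starts.
    ftp_go_and_get($host, $ip, $pwd, $os);
}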
So I guess one of my questions still is: will threads do what I am wanting, downloading the files at the same time, or do I need to run a perl script in the background, or fork()?
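fork() should do what you want here, and for 20-odd small transfers it is simpler than threads. A minimal, untested sketch that reuses the parsing loop above, with one child process per site and the parent waiting for all of them:

my @pids;
foreach my $line (@site) {
    chomp $line;
    my ($host, $ip, $pwd, $os) = split ' ', $line;

    my $pid = fork();
    die "fork failed: $!" unless defined $pid;

    if ($pid == 0) {
        # Child process: do one transfer, then exit.
        ftp_go_and_get($host, $ip, $pwd, $os);
        exit 0;
    }
    push @pids, $pid;    # Parent: remember the child and keep looping.
}

# Parent blocks here until every download has finished.
waitpid($_, 0) for @pids;

One caveat: each child gets its own copy of the script's globals, so anything a child pushes onto @ERRORS disappears when it exits; have the children log errors to a file or signal failure through their exit status instead.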