Same time ftp download in perl multiple sites. | Unix Linux Forums | Shell Programming and Scripting

Tags
fork, ftp, perl, threads

    #1  
Old 01-16-2009
kofs79, Registered User
Same time ftp download in perl multiple sites.

I currently have a perl script that has to retrieve a single file from 20+ sites every 10 minutes. Right now it will ftp to site1, download, and continue on up through site20. I am trying to get all the ftp downloads to run at the same time. This is where I have my problem; I can't get it to work.

Sample of the sam_sites.pl file:

AA|123.456.789.100|PASSWORD|Windows
BB|321.654.987.100|PASSWORD|Unix
CC|135.790.864.123|PASSWORD|Unix


Code from the ret_sam.pl file:

#!/usr/bin/perl
use Net::FTP;
use Switch;
use threads;
use threads::shared;

$version = "0.00.0001";
$RETSAMC = `ps -ef | grep ret_sam | grep -v grep | wc -l`;
$RETSAMS = "/home/fox/SAM/RET_SAM_RUN";
$HOMEDIR = "/home/fox/SAM/sites/test";

open(SITES_F, "sam_sites.pl") || die ("Missing sam_sites.pl file!");
@site=<SITES_F>;
close(SITES_F);
$NLETNUM = scalar(@site);

foreach $location_n (@site)
{
chomp($location_n);
($s_name,$s_ip,$s_pwd,$s_sys)=split(/\|/,$location_n);
$thr = threads->create(\&ftp_go_and_get, $s_name, $s_ip, $s_pwd, $s_sys);
$thr->join();
}

sub ftp_go_and_get {
if (lc($_[3]) eq "windows"){
$f_user = "fox\@openfox";
$directory = "SAM";
} elsif (lc($_[3]) eq "unix"){
$f_user = "fox";
$directory = "/home/fox/SAM";
}
$ftp=Net::FTP->new($_[1],Timeout=>60) or $newerr=1;
push @ERRORS, "Can't ftp to $_[1]($_[0]): $!\n" if $newerr;
myerr() if $newerr;
print "Connected to $_[1]($_[0])\n";
$ftp->login("$f_user","$_[2]") or $newerr=1;
print "Trying to get in and grab file.\n";
push @ERRORS, "Can't login to $_[1]($_[0]): $!\n" if $newerr;
# $ftp->quit if $newerr;
myerr() if $newerr;
print "Logged in\n";
$ftp->cwd("$directory") or $newerr=1;
push @ERRORS, "Can't find a spot to take dump.\n" if $newerr;
myerr() if $newerr;
# $ftp->quit if $newerr;
#$f_size=$ftp->size("@_[0]") or $newerr=1;
#print "The size of this file is $f_size\n";
# push @ERRORS, "Can't get size of file.\n" if $newerr;
# myerr() if $newerr;
# $ftp->quit if $newerr;
@files=$ftp->get($_[0], "$HOMEDIR/$_[0]") or $newerr=1;
push @ERRORS, "Can't get file $_[0]\n" if $newerr;
myerr() if $newerr;
print "Got file for $_[0]\n";
foreach(@files) {
print "I took a dump in $_\n";
}
$ftp->quit;
}

sub myerr {
print "Error: \n";
print @ERRORS;
exit 0;
}


I am looking for any suggestions or constructive criticism that can help me out here. Thanks in advance.
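For what it's worth, one thing stands out in the loop above: calling `$thr->join()` inside the foreach blocks until that thread finishes before the next one is even created, so the transfers still run one at a time. The usual pattern is to create all the threads first and join them afterward. A minimal sketch of that pattern (with a stub `worker` sub standing in for the Net::FTP code, so it runs without a network):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;

# Stub worker standing in for ftp_go_and_get; a real version would
# open the Net::FTP connection and fetch the file here.
sub worker {
    my ($name) = @_;
    return "done $name";
}

my @sites = ('AA', 'BB', 'CC');
my @threads;

# First loop: create every thread. No join() in here, or the
# downloads serialize again.
foreach my $site (@sites) {
    push @threads, threads->create(\&worker, $site);
}

# Second loop: collect results. All workers have been running
# concurrently by the time we start joining.
my @results = map { $_->join() } @threads;
print "$_\n" for @results;
```

The same two-loop shape carries over directly to the real script: the first loop would pass `$s_name, $s_ip, $s_pwd, $s_sys` to `ftp_go_and_get`, and the join loop would run after the foreach over `@site` finishes.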
    #2  
Old 01-16-2009
Rhije, Registered User
At the same time!? That may be like a giant bandwidth hog.. but oookay.

You may be able to do it like this..

Loop from 1 to 20... (numbers being 1..20)


Code:
for number in $(seq 1 20); do
    var="site$number"
    perl somescript.pl "$var" &
done

Well, that, or you can learn to fork in perl.

Basically, here is the logic: loop through all of your sites, execute the perl script with the argument of the site you want to ftp data to/from, and throw it in the background to do its thing (so the loop continues). No guarantees it will work, but it should.
    #3  
Old 01-16-2009
kofs79, Registered User
So you think I would have to have the subroutine in its own script and just run it in the background? My main script has other functions, but I can try running this again, calling the ftp portion with &.

Do I even want to use threads in this, then? I was just calling my subroutine `&ftp_go_and_get($ARG1, $ARG2, $ARG3, $ARG4)` before, but that did not give me the desired results either.

I have started to look at fork() but have never used it myself.

As for bandwidth, the total of all the files is under 40k, so it is not really an issue here.
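For reference, the fork() version of the idea has the same shape as the thread version: fork all the children first, then wait for them. A bare sketch, with the actual FTP call replaced by a comment and error handling kept minimal:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @sites = ('AA', 'BB', 'CC');
my @pids;

foreach my $site (@sites) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child process: the Net::FTP transfer for $site would go here.
        # Each child runs concurrently with its siblings.
        exit 0;
    }
    push @pids, $pid;    # parent records the child and keeps looping
}

# Parent: reap every child before moving on, so no zombies are left.
waitpid($_, 0) for @pids;
my $finished = scalar @pids;
print "$finished transfers finished\n";
```

One difference to be aware of: forked children are separate processes, so they cannot push onto a shared @ERRORS array the way the threaded version tries to; each child would have to report errors through its exit status or a file.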
    #4  
Old 01-16-2009
Rhije, Registered User
Let's see...

So let's call the perl script that does all the ftp stuff ftp.pl.

ftp.pl takes one argument: the site that the transfer is to/from.


Code:
hostfile="hosts.txt"

while read -r host; do
    perl ftp.pl "$host" &
done < "$hostfile"
wait   # optional: pause here until all the background transfers finish

That is assuming that you have all of the hosts inside a file called hosts.txt (that way you can just append additional hosts to the bottom). Easy as pie. If you want, you can use rsync, scp, rcp... whatever, in linux.
    #5  
Old 01-16-2009
kofs79, Registered User
I do have a file where I have all the hosts, IPs, PWDs and OS types stored. I called it sam_sites.pl.

Code:
AA|123.456.789.100|PASSWORD|Windows
BB|321.654.987.100|PASSWORD|Unix
CC|135.790.864.123|PASSWORD|Unix


I read in the file into an array @site


Code:
open(SITES_F, "sam_sites.pl") || die ("Missing sam_sites.pl file!");
@site=<SITES_F>;
close(SITES_F);
$NLETNUM = scalar(@site);

I then have a foreach loop where I take each line and set the variables to host, IP, PWD and OS. Then I start the ftp transfer.

Code:
foreach $location_n (@site)
{
chomp($location_n);
($s_name,$s_ip,$s_pwd,$s_sys)=split(/\|/,$location_n);
$thr = threads->create(\&ftp_go_and_get, $s_name, $s_ip, $s_pwd, $s_sys);
$thr->join();
}

So I guess one of my questions still is: will threads do what I am wanting with downloading the files at the same time, or do I need to run a perl script in the background or fork()?

I appreciate your help.
    #6  
Old 01-16-2009
Rhije, Registered User
Hrm, I'm not really sure. I am not the biggest perl guru; that is why I suggested the background method.
