Simultaneous FTP downloads in Perl from multiple sites


 
# 1  
Old 01-16-2009

I currently have a Perl script that has to retrieve a single file from 20+ sites every 10 minutes. Right now it FTPs to site1, downloads, and continues on like that up through site20. I am trying to get all of the FTP downloads to run at the same time, and this is where I have my problem: I can't get it to work.

Sample of the sam_sites.pl file:

AA|123.456.789.100|PASSWORD|Windows
BB|321.654.987.100|PASSWORD|Unix
CC|135.790.864.123|PASSWORD|Unix


Code from the ret_sam.pl file:

#!/usr/bin/perl
use Net::FTP;
use Switch;
use threads;
use threads::shared;

$version = "0.00.0001";
$RETSAMC = `ps -ef | grep ret_sam | grep -v grep | wc -l`;
$RETSAMS = "/home/fox/SAM/RET_SAM_RUN";
$HOMEDIR = "/home/fox/SAM/sites/test";

open(SITES_F, "sam_sites.pl") || die("Missing sam_sites.pl file!");
@site = <SITES_F>;
close(SITES_F);
$NLETNUM = scalar(@site);

foreach $location_n (@site)
{
    chomp($location_n);
    ($s_name, $s_ip, $s_pwd, $s_sys) = split(/\|/, $location_n);
    $thr = threads->create(\&ftp_go_and_get, $s_name, $s_ip, $s_pwd, $s_sys);
    $thr->join();
}

sub ftp_go_and_get {
    if (lc($_[3]) eq "windows") {
        $f_user = "fox\@openfox";
        $directory = "SAM";
    } elsif (lc($_[3]) eq "unix") {
        $f_user = "fox";
        $directory = "/home/fox/SAM";
    }
    $ftp = Net::FTP->new($_[1], Timeout => 60) or $newerr = 1;
    push @ERRORS, "Can't ftp to $_[1]($_[0]): $!\n" if $newerr;
    myerr() if $newerr;
    print "Connected to $_[1]($_[0])\n";
    $ftp->login("$f_user", "$_[2]") or $newerr = 1;
    print "Trying to get in and grab file.\n";
    push @ERRORS, "Can't login to $_[1]($_[0]): $!\n" if $newerr;
#    $ftp->quit if $newerr;
    myerr() if $newerr;
    print "Logged in\n";
    $ftp->cwd("$directory") or $newerr = 1;
    push @ERRORS, "Can't find a spot to take dump.\n" if $newerr;
    myerr() if $newerr;
#    $ftp->quit if $newerr;
#    $f_size = $ftp->size($_[0]) or $newerr = 1;
#    print "The size of this file is $f_size\n";
#    push @ERRORS, "Can't get size of file.\n" if $newerr;
#    myerr() if $newerr;
#    $ftp->quit if $newerr;
    @files = $ftp->get($_[0], "$HOMEDIR/$_[0]") or $newerr = 1;
    push @ERRORS, "Can't get file $_[0]\n" if $newerr;
    myerr() if $newerr;
    print "Got file for $_[0]\n";
    foreach (@files) {
        print "I took a dump in $_\n";
    }
    $ftp->quit;
}

sub myerr {
    print "Error: \n";
    print @ERRORS;
    exit 0;
}


I am looking for any suggestions or constructive critism that can help me out here. Thanks in advance.
# 2  
Old 01-16-2009
At the same time!? That may be like a giant bandwidth hog.. but oookay.

You may be able to do it like this..

Loop from 1 to 20...

Code:
for number in {1..20}; do
    var="site"$number
    perl somescript.pl "$var" &
done

Well, that, or you can learn to fork in perl.

Basically here is the logic. Loop through all of your sites, and execute the perl script with the argument of what site you want to ftp data to/from, and then throw it in the background to do its thing (so the loop continues). No guarantees it will work, but it should.
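In Perl itself, the fork approach this reply describes might be sketched like this (a minimal sketch; `do_transfer` is a hypothetical stand-in for the real FTP code):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical stand-in for the real per-site FTP transfer.
sub do_transfer {
    my ($site) = @_;
    print "transferring for $site\n";
}

my @sites = ('AA', 'BB', 'CC');
my @pids;

for my $site (@sites) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child process: do one transfer, then exit so it does
        # not fall back into the parent's loop.
        do_transfer($site);
        exit 0;
    }
    # Parent: remember the child's pid and move straight on,
    # so all transfers run concurrently.
    push @pids, $pid;
}

# Reap every child so none are left as zombies.
waitpid($_, 0) for @pids;
```

The parent never blocks inside the loop; all the waiting happens once, after every child has been started.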
# 3  
Old 01-16-2009
So you think I would have to have the subroutine in its own script and just run it in the background? My main script has other functions, but I can try running this again, calling the ftp portion with &.

Do I even want to use threads then? I was just calling my subroutine `&ftp_go_and_get $ARG1 $ARG2 $ARG3 $ARG4` before, but that did not give me the desired results either.

I have started to look at fork() but have never done it myself.

As for bandwidth the total of all the files is under 40k and it is not really an issue for this.
# 4  
Old 01-16-2009
Let's see...

So let's call the perl script that does all the ftp stuff ftp.pl.

ftp.pl takes one argument: the site that the transfer is to/from.

Code:
hostfile="hosts.txt"

for host in `cat $hostfile`; do
     perl ftp.pl $host &
done

That is assuming that you have all of the hosts inside a file called hosts.txt (that way you can just append additional hosts to the bottom). Easy as pie. If you want, you can use rsync, scp, rcp... whatever, in Linux.
# 5  
Old 01-16-2009
I do have a file where I have all the hosts, IPs, passwords, and OS types stored. I called it sam_sites.pl.
Code:
AA|123.456.789.100|PASSWORD|Windows
BB|321.654.987.100|PASSWORD|Unix
CC|135.790.864.123|PASSWORD|Unix


I read in the file into an array @site

Code:
open(SITES_F, "sam_sites.pl") || die ("Missing sam_sites.pl file!");
@site=<SITES_F>;
close(SITES_F);
$NLETNUM = scalar(@site);

I then have a foreach loop where I take each line and set the variables to host, IP, PWD, and OS. Then I start the ftp transfer.
Code:
foreach $location_n (@site)
{
    chomp($location_n);
    ($s_name, $s_ip, $s_pwd, $s_sys) = split(/\|/, $location_n);
    $thr = threads->create(\&ftp_go_and_get, $s_name, $s_ip, $s_pwd, $s_sys);
    $thr->join();
}

So I guess one of my questions still is: will threads do what I am wanting, downloading the files at the same time, or do I need to run a perl script in the background or use fork()?

I appreciate your help.
# 6  
Old 01-16-2009
hrm, I'm not really sure. I am not the biggest perl guru, so that is why I suggested the background method.
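On the threads question in post #5: the loop shown joins each thread immediately after creating it, and join() blocks until that thread finishes, so the downloads still run one at a time. One way to fix it is to start all the threads in one pass and join them only after the loop. A sketch of that pattern (with a dummy ftp_go_and_get standing in for the real FTP code so it runs on its own):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use threads;

# Dummy stand-in for the real ftp_go_and_get so the sketch is self-contained.
sub ftp_go_and_get {
    my ($s_name, $s_ip, $s_pwd, $s_sys) = @_;
    return "got file for $s_name";
}

my @site = (
    'AA|123.456.789.100|PASSWORD|Windows',
    'BB|321.654.987.100|PASSWORD|Unix',
    'CC|135.790.864.123|PASSWORD|Unix',
);

# First pass: start every thread. Do NOT join inside this loop --
# joining right after create() waits for that site to finish,
# which downloads the sites one at a time again.
my @threads;
foreach my $location_n (@site) {
    chomp($location_n);
    my ($s_name, $s_ip, $s_pwd, $s_sys) = split(/\|/, $location_n);
    push @threads,
        threads->create(\&ftp_go_and_get, $s_name, $s_ip, $s_pwd, $s_sys);
}

# Second pass: all transfers are now running; wait for each to finish.
my @results = map { $_->join() } @threads;
print "$_\n" for @results;
```

With this shape, all 20+ transfers overlap, and the script only waits once at the end for the slowest site.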