04-21-2012
I take it that you are not able to start a service such as ssh (for scp) or ftp on the destination server?
If not, I think you have to use an external package such as "axel" (source), "aria2" (source2) or "aget" (source).
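For example, assuming the file is reachable over plain HTTP and that parallel connections are allowed, a download with either tool might look like this (the URL and connection count are just placeholders):
  # aria2: download with up to 4 connections to the server
  aria2c -x 4 -o backup.tar.gz "http://www.example.com/backup.tar.gz"
  # axel: same idea, 4 parallel connections
  axel -n 4 -o backup.tar.gz "http://www.example.com/backup.tar.gz"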
Kind regards
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hi
I need a shell script that will download a text file every second from an HTTP server using wget.
Can anyone provide me with pointers or a sample script that will help me with this task?
regards
techie (1 Reply)
Discussion started by: techie82
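A minimal sketch of such a poll loop, assuming the file may simply be overwritten on each pass (the URL and output name are placeholders):
  #!/bin/sh
  # fetch the file once per second, quietly, always writing to the same name
  while true
  do
      wget -q -O data.txt "http://www.example.com/data.txt"
      sleep 1
  done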
2. UNIX for Dummies Questions & Answers
Hello Everyone,
I'm trying to use wget recursively to download a file.
Only HTML files are being downloaded, instead of the target file.
I'm trying this for the first time, here's what I've tried:
wget -r -O jdk.bin... (4 Replies)
Discussion started by: thoughts
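A hedged sketch of two ways around this, with placeholder URLs since the real download location isn't shown: fetch the binary directly by its full URL, or recurse but accept only the wanted suffix so the HTML index pages are discarded.
  # fetch the file directly (URL is a placeholder)
  wget -O jdk.bin "http://www.example.com/jdk/jdk.bin"
  # or recurse, keep only .bin files, and don't recreate the directory tree
  wget -r -np -nd -A '*.bin' "http://www.example.com/jdk/"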
3. Shell Programming and Scripting
Hi All
I want to download the srs8.3.0.1.standard.linux24_EM64T.tar.gz file from the following website:
http://downloads.biowisdomsrs.com/srs83_dist/
This website contains lots of zipped files, but I want to download only the file above and discard the other zipped files.
When I am trying the... (1 Reply)
Discussion started by: alphasahoo
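Assuming the archive sits directly under that path, a sketch of two ways to grab just that one file:
  # fetch the single archive by its full URL (path is assumed from the listing page)
  wget "http://downloads.biowisdomsrs.com/srs83_dist/srs8.3.0.1.standard.linux24_EM64T.tar.gz"
  # or restrict a recursive fetch to exactly that file name
  wget -r -np -nd -A 'srs8.3.0.1.standard.linux24_EM64T.tar.gz' "http://downloads.biowisdomsrs.com/srs83_dist/"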
4. UNIX and Linux Applications
I need to download the srs8.3.0.1.standard.linux26_32.tar.gz file from the following website:
http://downloads.biowisdomsrs.com/srs83_dist
There are many gzip files along with the above one on that site, but I want to download only srs8.3.0.1.standard.linux26_32.tar.gz from... (1 Reply)
Discussion started by: alphasahoo
5. Shell Programming and Scripting
Hi,
I want to download some online data using the wget command and write the contents to a file.
For example, this is the URL I want to download and store in a file called "results.txt".
#This is the URL.
$url="http://www.example.com";
#retrieve data and store in a file results.txt
... (3 Replies)
Discussion started by: vanitham
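A minimal shell equivalent of what the post describes, using the URL shown there:
  # retrieve the page and store it in results.txt
  url="http://www.example.com"
  wget -q -O results.txt "$url"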
6. Ubuntu
I am using ubuntu 10.04 LTS
I tried to download the file using wget; the file size is 105.00 MB, but wget downloads only around 44K.
Maybe I am using wget the wrong way; any suggestions, please?
Below is the command I used and the response from system.
wget --tries=10 -nd -nH --use=user... (10 Replies)
Discussion started by: LinuxLearner
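One hedged guess for a transfer that stops after a few kilobytes: resume the partial file with -c and keep retrying (the URL below is only a placeholder; any credentials would still have to be supplied as in the original command).
  # continue a partial download and retry up to 10 times
  wget -c --tries=10 "http://www.example.com/bigfile.tar.gz"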
7. UNIX for Dummies Questions & Answers
Hi,
For an order I requested, the provider has uploaded a tar file to a public FTP site which internally has tons of (compressed) files, and I need to download the files that follow a particular pattern, which would be a few hundred.
Note: The order can't be requested for files that follow the... (7 Replies)
Discussion started by: Amalan
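A rough sketch of one approach, assuming GNU tar and that the archive path and pattern below are placeholders: download the whole archive once, then extract only the members that match.
  # fetch the archive from the FTP site (path is a placeholder)
  wget "ftp://ftp.example.com/pub/order/archive.tar"
  # extract only the members whose names match the pattern
  tar -xf archive.tar --wildcards 'order_2012*.gz'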
8. Shell Programming and Scripting
Hello all,
I want to write an auto-update script for my embedded device that can check for and download a newer version of my program and extract the files on the device.
The download center is hosted on a remote web server.
The script checks the hosted file on the web site and, if the new version is there... (8 Replies)
Discussion started by: stefki
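A rough sketch of the check-and-fetch step, assuming the server publishes a plain-text version file next to the tarball; every name and URL here is made up for illustration.
  #!/bin/sh
  BASE_URL="http://www.example.com/myprog"        # hypothetical download center
  remote=$(wget -qO- "$BASE_URL/latest.txt")      # version published on the server
  current=$(cat /etc/myprog-version 2>/dev/null)  # version installed on the device
  if [ "$remote" != "$current" ]; then
      wget -q -O /tmp/myprog.tar.gz "$BASE_URL/myprog-$remote.tar.gz"
      tar -xzf /tmp/myprog.tar.gz -C /opt/myprog
      echo "$remote" > /etc/myprog-version
  fi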
9. Shell Programming and Scripting
Hi All,
I am trying to download an XML file from a URL through wget and am successful in that, but the problem is that I have to check for some special characters inside that XML. When I download it through wget, it transfers the content of the XML as plain text and I'm not able to search for those... (2 Replies)
Discussion started by: dips_ag
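As a minimal sketch (the URL and the character being searched for are placeholders), the document can be streamed to stdout and searched without saving it first:
  # fetch to stdout and look for a specific character or entity
  wget -qO- "http://www.example.com/feed.xml" | grep -n '&#'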
10. Shell Programming and Scripting
Hello guys, first post, sorry if I made some mess here =)
Using Ubuntu 14.04 LTS, 64-bit server version.
I have a list (url.list) with only URLs to download, one per line, that looks like this:
http://domain.com/teste.php?a=2&b=3&name=1
http://domain.com/teste.php?a=2&b=3&name=2
...... (6 Replies)
Discussion started by: tonispa
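A minimal way to feed such a list to wget; the per-line renaming loop is only a suggestion, for giving the downloads tidier names than the raw query strings.
  # download every URL in the list, one per line
  wget -i url.list
  # or name each result after its line number to keep the files apart
  n=0
  while IFS= read -r url; do
      n=$((n+1))
      wget -q -O "result_$n.html" "$url"
  done < url.list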
LEARN ABOUT OSX
URI::URL (Perl 5.12)
URI::URL(3) User Contributed Perl Documentation URI::URL(3)
NAME
URI::URL - Uniform Resource Locators
SYNOPSIS
$u1 = URI::URL->new($str, $base);
$u2 = $u1->abs;
DESCRIPTION
This module is provided for backwards compatibility with modules that depend on the interface provided by the "URI::URL" class that used to
be distributed with the libwww-perl library.
The following differences exist compared to the "URI" class interface:
o The URI::URL module exports the url() function as an alternate constructor interface.
o The constructor takes an optional $base argument. The "URI::URL" class is a subclass of "URI::WithBase".
o The URI::URL->newlocal class method is the same as URI::file->new_abs.
o URI::URL::strict(1)
o $url->print_on method
o $url->crack method
o $url->full_path: same as ($uri->abs_path || "/")
o $url->netloc: same as $uri->authority
o $url->epath, $url->equery: same as $uri->path, $uri->query
o $url->path and $url->query pass unescaped strings.
o $url->path_components: same as $uri->path_segments (if you don't consider path segment parameters)
o $url->params and $url->eparams methods
o $url->base method. See URI::WithBase.
o $url->abs and $url->rel have an optional $base argument. See URI::WithBase.
o $url->frag: same as $uri->fragment
o $url->keywords: same as $uri->query_keywords
o $url->localpath and friends map to $uri->file.
o $url->address and $url->encoded822addr: same as $uri->to for mailto URI
o $url->groupart method for news URI
o $url->article: same as $uri->message
SEE ALSO
URI, URI::WithBase
COPYRIGHT
Copyright 1998-2000 Gisle Aas.
perl v5.12.5 2011-08-13 URI::URL(3)