
lwp-download(1) [redhat man page]

LWP-DOWNLOAD(1) 					User Contributed Perl Documentation					   LWP-DOWNLOAD(1)

NAME
       lwp-download - fetch large files from the net

SYNOPSIS
       lwp-download [-a] <url> [<local file>]

DESCRIPTION
       The lwp-download program will download the document specified by the URL given as the first command line argument to a local file. The local filename used to save the document is guessed from the URL unless specified as the second command line argument.

       The lwp-download program is implemented using the libwww-perl library. It is better suited to download big files than the lwp-request program because it does not store the file in memory. Another benefit is that it will keep you updated about its progress and that you don't have many options to worry about.

       Use the "-a" option to save the file in text (ASCII) mode; this might make a difference on DOSish systems.

EXAMPLE
       Fetch the newest and greatest perl version:

         $ lwp-download http://www.perl.com/CPAN/src/latest.tar.gz
         Saving to 'latest.tar.gz'...
         1.47 MB received in 22 seconds (68.7 KB/sec)

AUTHOR
       Gisle Aas <gisle@aas.no>

libwww-perl-5.65                                2002-01-02                                LWP-DOWNLOAD(1)
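
As the DESCRIPTION notes, lwp-download streams the response to disk rather than buffering it in memory. The sketch below shows the same idea with plain LWP::UserAgent; it is an illustration under simplified filename-guessing assumptions, not the program's actual source.

    #!/usr/bin/perl
    # Minimal sketch, assuming libwww-perl (and its URI dependency) is
    # installed: stream a URL to a local file as the description outlines.
    use strict;
    use warnings;
    use LWP::UserAgent;
    use URI;

    my ($url, $file) = @ARGV;
    die "Usage: $0 <url> [<local file>]\n" unless defined $url;

    # Guess the local filename from the last segment of the URL path when
    # no second argument is given (a simplified version of the guessing
    # the manual describes).
    unless (defined $file) {
        my @segments = URI->new($url)->path_segments;
        $file = (defined $segments[-1] && length $segments[-1]) ? $segments[-1] : 'index';
    }

    my $ua = LWP::UserAgent->new;
    # The ':content_file' option makes LWP write the body straight to disk
    # instead of keeping it in memory -- the same reason lwp-download
    # copes well with big files.
    my $res = $ua->get($url, ':content_file' => $file);
    die 'Download failed: ' . $res->status_line . "\n" unless $res->is_success;
    print "Saved to '$file'\n";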

Check Out this Related Man Page

LWP-DOWNLOAD(1) 					User Contributed Perl Documentation					   LWP-DOWNLOAD(1)

NAME
       lwp-download - Fetch large files from the web

SYNOPSIS
       lwp-download [-a] [-s] <url> [<local path>]

DESCRIPTION
       The lwp-download program will save the file at <url> to a local file.

       If <local path> is not specified, then the current directory is assumed. If <local path> is a directory, then the last segment of the URL's path is appended to form a local filename. If the URL's path ends with a slash, the name "index" is used. With the -s option, the last segment of the filename is picked up from server-provided sources such as the Content-Disposition header or any redirect URLs. A file extension matching the server-reported Content-Type might also be appended. If a file with the resulting filename already exists, lwp-download will prompt before overwriting it and will fail if its standard input is not a terminal. This form of invocation will also fail if no acceptable filename can be derived from the sources mentioned above.

       If <local path> is not a directory, then it is simply used as the path to save into. If the file already exists, it is overwritten.

       The lwp-download program is implemented using the libwww-perl library. It is better suited to download big files than the lwp-request program because it does not store the file in memory. Another benefit is that it will keep you updated about its progress and that you don't have many options to worry about.

       Use the "-a" option to save the file in text (ASCII) mode; this might make a difference on DOSish systems.

EXAMPLE
       Fetch the newest and greatest perl version:

         $ lwp-download http://www.perl.com/CPAN/src/latest.tar.gz
         Saving to 'latest.tar.gz'...
         11.4 MB received in 8 seconds (1.43 MB/sec)

AUTHOR
       Gisle Aas <gisle@aas.no>

perl v5.16.3                                    2012-01-14                                LWP-DOWNLOAD(1)
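
The filename rules in this newer DESCRIPTION (directory targets, trailing-slash URLs falling back to "index") can be illustrated with a short sketch. The helper name local_save_path is made up for this example, and it covers only the default behaviour, not the -s / Content-Disposition path.

    #!/usr/bin/perl
    # Sketch of the documented filename rules: if the local path is a
    # directory, append the URL's last path segment, or "index" for URLs
    # ending in a slash. Illustration only, not lwp-download's source.
    use strict;
    use warnings;
    use URI;
    use File::Spec;

    sub local_save_path {
        my ($url, $local) = @_;
        $local = '.' unless defined $local;     # default: current directory
        return $local unless -d $local;         # plain path: use it as given

        my @segments = URI->new($url)->path_segments;
        my $name = $segments[-1];
        $name = 'index' unless defined $name && length $name;
        return File::Spec->catfile($local, $name);
    }

    print local_save_path('http://www.perl.com/CPAN/src/latest.tar.gz', '.'), "\n";   # ./latest.tar.gz
    print local_save_path('http://www.perl.com/CPAN/', '.'), "\n";                    # ./index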

3 More Discussions You Might Find Interesting

1. UNIX for Advanced & Expert Users

Command to download from HTTP (URL)

Hi, what is the UNIX command to download a file or data from an HTTP location? curl (Linux) did not work. Thank you. (4 Replies)
Discussion started by: skm123
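
The question above is essentially what lwp-download itself answers. A minimal sketch using libwww-perl's LWP::Simple interface is shown below; the URL and filename are placeholders, not values from the thread.

    #!/usr/bin/perl
    # Minimal sketch: download a file over HTTP with LWP::Simple.
    use strict;
    use warnings;
    use LWP::Simple qw(getstore is_success);

    # Placeholder URL and output name for illustration only.
    my $status = getstore('http://example.com/data.tar.gz', 'data.tar.gz');
    die "Download failed with HTTP status $status\n" unless is_success($status);
    print "Saved data.tar.gz\n";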

2. Shell Programming and Scripting

Downloading processed HTML of PHP page

Hi, I am trying to obtain the HTML code of a PHP page (http:// areferee .com/soccer/test1.php?quiz=50&ran=1&t=5) after the page has been processed; I want pure HTML. Is there a Unix command I can use to do this? I have tried wget, GET, and curl, but I am getting odd behaviour, i.e. it is not... (9 Replies)
Discussion started by: djcas
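
For the discussion above: any HTTP client receives the HTML after the server has already run the PHP, so a plain GET is enough. One common cause of odd behaviour on the command line is an unquoted & in the URL, though the thread does not say whether that was the issue here. A hedged sketch with LWP::UserAgent, using a placeholder URL rather than the poster's:

    #!/usr/bin/perl
    # Sketch: fetch the server-rendered HTML of a dynamic page.
    use strict;
    use warnings;
    use LWP::UserAgent;

    # Hypothetical URL with query parameters, for illustration only.
    my $url = 'http://example.com/soccer/test1.php?quiz=50&ran=1&t=5';
    my $ua  = LWP::UserAgent->new;
    my $res = $ua->get($url);
    die 'Request failed: ' . $res->status_line . "\n" unless $res->is_success;

    # decoded_content() returns the body with Content-Encoding and
    # charset handling applied.
    print $res->decoded_content;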

3. Shell Programming and Scripting

Monitoring an HTML web page for changes

Hello, I need to monitor an HTML web page for ANY changes and be able to know whether it has been modified since the last query. I do not need to know what the modifications are; just a notification is enough. This is a simple web page and I don't need to parse the links any further. Is it possible to do... (10 Replies)
Discussion started by: prvnrk
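
For the monitoring question above, one simple approach (a sketch, not taken from the thread) is to fetch the page on a schedule and compare a digest of the body with the digest stored from the previous run; the state file location is an arbitrary choice for this example.

    #!/usr/bin/perl
    # Sketch: detect whether a page changed since the last check by
    # comparing an MD5 digest of its body with a stored digest.
    use strict;
    use warnings;
    use LWP::UserAgent;
    use Digest::MD5 qw(md5_hex);

    my $url   = shift @ARGV or die "Usage: $0 <url>\n";
    my $state = '/tmp/page-digest.txt';          # where the last digest is kept

    my $res = LWP::UserAgent->new->get($url);
    die 'Request failed: ' . $res->status_line . "\n" unless $res->is_success;
    my $digest = md5_hex($res->content);         # digest of the raw body bytes

    my $old = '';
    if (open my $fh, '<', $state) { chomp($old = <$fh> // ''); close $fh; }

    if ($digest ne $old) {
        print "Page changed since last check\n";
        open my $fh, '>', $state or die "Cannot write $state: $!\n";
        print {$fh} "$digest\n";
        close $fh;
    } else {
        print "No change\n";
    }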