MIRRORTOOL(1) OMT documentation MIRRORTOOL(1)

NAME
mirrortool.pl - OpaL Mirror Tool (OMT)
DESCRIPTION
Creates a local mirror of a web page, with features such as link rewriting and regexp-based filtering (see the options below).
USAGE
mirrortool.pl [options] [url] [options] [url] [...]
OPTIONS
--images : Include <img src=...> references in the download. (default)
--noimages : Do not include <img src=...> references in the download.
--depth n : Maximum recursion depth. (default 1)
--store "regexp" : Files matching regexp are actually stored locally.
: Multiple patterns may be separated with | (logical or).
--rewrite "from=>to" : URLs are rewritten using these rules.
: Multiple rules may be separated with | (logical or).
: Do not rewrite the directory part, because it will affect
: later lookups. (Known limitation.)
--what "regexp" : Files matching regexp are downloaded and traversed.
: Multiple patterns may be separated with | (logical or).
--dir basedir : Where to store local files.
--nohostcheck : Do not check whether the URL points to another host.
--notreecheck : Do not check whether the URL points to another directory tree.
--force : Overwrite all files.
--debug : Print debug-messages.
--retry n : Number of times a URL will be retried. (default 1)
--auth user:pass : Use HTTP Basic authentication.
--proxy url : Use a proxy server (like http://u:p@localhost/).
--help : Print this text.
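The "from=>to" rule format of the --rewrite option, with |-separated alternatives, can be illustrated with a short sketch. This is a hypothetical helper written for this page, not mirrortool.pl's actual code; the function name apply_rewrites and the example hostnames are made up:

```python
import re

def apply_rewrites(url, rules):
    """Apply "from=>to" rewrite rules to a URL.

    Hypothetical sketch of the --rewrite rule format described
    above: rules are |-separated, and each "from" part is treated
    as a regexp replaced by its "to" part. mirrortool.pl's real
    implementation may differ.
    """
    for rule in rules.split("|"):
        frm, to = rule.split("=>", 1)
        url = re.sub(frm, to, url)
    return url

# Rewrite one host to another, as a mirror setup might:
print(apply_rewrites("http://old.example.com/a.html",
                     "old\\.example\\.com=>mirror.example.org"))
# http://mirror.example.org/a.html
```

Note that because each "from" is a regexp, dots in hostnames should be escaped.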
AUTHOR
Ola Lundqvist <opal@lysator.liu.se>
SEE ALSO
mirrortool.pl(1)

perl v5.8.8 2002-04-15 MIRRORTOOL(1)
LWP-DOWNLOAD(1) User Contributed Perl Documentation LWP-DOWNLOAD(1)

NAME
lwp-download - Fetch large files from the web
SYNOPSIS
lwp-download [-a] [-s] <url> [<local path>]
DESCRIPTION
The lwp-download program saves the file at url to a local file.
If local path is not specified, the current directory is assumed.
If local path is a directory, the last segment of the URL's path is appended to form a local filename. If the URL path ends with a
slash, the name "index" is used. With the -s option, the last segment of the filename is picked up from server-provided sources such
as the Content-Disposition header or any redirect URLs. A file extension matching the server-reported Content-Type might also be
appended. If a file with the resulting filename already exists, lwp-download will prompt before overwriting and will fail if its
standard input is not a terminal. This form of invocation will also fail if no acceptable filename can be derived from the sources
mentioned above.
If local path is not a directory, it is simply used as the path to save to. If the file already exists, it is overwritten.
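The default filename derivation described above (last path segment, or "index" for a trailing slash) can be sketched as follows. This is an illustrative stand-alone helper, not lwp-download's Perl source; with -s, the real tool also consults Content-Disposition headers and redirect URLs, which this sketch omits:

```python
from urllib.parse import urlsplit
import posixpath

def derive_filename(url):
    """Derive a local filename from a URL as described above:
    take the last segment of the URL path, or fall back to
    "index" when the path ends with a slash. Hypothetical
    sketch of lwp-download's default behaviour only.
    """
    path = urlsplit(url).path
    name = posixpath.basename(path)
    return name or "index"

print(derive_filename("http://www.perl.com/CPAN/src/latest.tar.gz"))  # latest.tar.gz
print(derive_filename("http://example.com/dir/"))                     # index
```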
The lwp-download program is implemented using the libwww-perl library. It is better suited to downloading big files than the
lwp-request program because it does not store the file in memory. Other benefits are that it keeps you updated about its progress
and that there are few options to worry about.
Use the "-a" option to save the file in text (ASCII) mode. This might make a difference on DOS-like systems.
EXAMPLE
Fetch the newest and greatest perl version:
$ lwp-download http://www.perl.com/CPAN/src/latest.tar.gz
Saving to 'latest.tar.gz'...
11.4 MB received in 8 seconds (1.43 MB/sec)
AUTHOR
Gisle Aas <gisle@aas.no>
perl v5.12.1 2010-07-05 LWP-DOWNLOAD(1)