06-13-2008
download file using wget
I need to download the file srs8.3.0.1.standard.linux26_32.tar.gz from the following website:
http://downloads.biowisdomsrs.com/srs83_dist
The site hosts many other gzipped files alongside this one, but I want to download only srs8.3.0.1.standard.linux26_32.tar.gz.
wget http://downloads.biowisdomsrs.com/srs83_dist -nd -P SRS_drop_area -N -r -l 1 \
    --http-user=$hu --http-passwd=$hp \
    --proxy-user=$pu --proxy-passwd=$pp \
    --proxy=on
But it downloads all the zipped files one after another, and I want only srs8.3.0.1.standard.linux26_32.tar.gz.
How do I do that?
Please help
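One way (a sketch, not tested against that server) is to skip the recursive crawl entirely and give wget the full URL of the one file. Note that current wget spells the options `--http-password`/`--proxy-password` rather than `--http-passwd`/`--proxy-passwd`, and `--proxy=on` has been replaced by the `use_proxy` setting. The credential values below are placeholders:

```shell
#!/bin/sh
# Sketch: fetch exactly one file instead of crawling the directory.
# The four credential values are placeholders, not real accounts.
hu='myuser'
hp='mypass'
pu='proxyuser'
pp='proxypass'

url='http://downloads.biowisdomsrs.com/srs83_dist/srs8.3.0.1.standard.linux26_32.tar.gz'

if wget -N -P SRS_drop_area \
        --http-user="$hu" --http-password="$hp" \
        --proxy-user="$pu" --proxy-password="$pp" \
        -e use_proxy=yes \
        "$url"; then
    echo "saved ${url##*/} to SRS_drop_area/"
else
    echo "download failed" >&2
fi
```

If the directory listing must still be crawled, the same effect comes from keeping the original `-r -l 1 -nd` form and adding `-A 'srs8.3.0.1.standard.linux26_32.tar.gz'` so every other file is rejected.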
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hi
I need a Shell script that will download a text file every second from a http server using wget.
Can anyone provide me any pointers or sample scripts that will help me go about this task ???
regards
techie (1 Reply)
Discussion started by: techie82
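A minimal polling loop along those lines might look like this; the URL, output name, and the iteration cap are placeholders (a real poller would use `while true`):

```shell
#!/bin/sh
# Sketch: fetch the same text file once per second.
url='http://example.com/status.txt'
out='status.txt'

i=0
while [ "$i" -lt 5 ]; do                       # capped at 5 passes for the sketch
    wget -q -O "$out" "$url" || echo "fetch failed" >&2
    i=$((i + 1))
    sleep 1
done
```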
2. UNIX for Dummies Questions & Answers
Hello Everyone,
I'm trying to use wget recursively to download a file.
Only html files are being downloaded, instead of the target file.
I'm trying this for the first time, here's what I've tried:
wget -r -O jdk.bin... (4 Replies)
Discussion started by: thoughts
3. Shell Programming and Scripting
Hi All
I want to download the srs8.3.0.1.standard.linux24_EM64T.tar.gz file from the following website:
http://downloads.biowisdomsrs.com/srs83_dist/
But this website contains lots of zipped files, and I want to download only the above file, discarding the others.
When I am trying the... (1 Reply)
Discussion started by: alphasahoo
4. Shell Programming and Scripting
Hi,
I want to download some online data using wget command and write the contents to a file.
For example this is the URL i want to download and store it in a file called "results.txt".
#This is the URL.
$url="http://www.example.com";
#retrieve data and store in a file results.txt
... (3 Replies)
Discussion started by: vanitham
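For that case, `-O` alone is enough to name the output file. A sketch, reusing the URL from the post:

```shell
#!/bin/sh
# Sketch: fetch one URL and store the response body in results.txt.
url='http://www.example.com'

# -q suppresses progress output; -O names the local output file explicitly.
if wget -q -O results.txt "$url"; then
    echo "contents saved to results.txt"
else
    echo "download failed" >&2
fi
```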
5. Shell Programming and Scripting
Ok, this is quite weird.
wget -r mysite.com/mylink/
should get all the files recursively from the 'mylink' folder.
The problem is that wget saves an index.html file!
When I open this index.html with my browser I realize that it shows all the files in the current folder (plus an option to move... (3 Replies)
Discussion started by: hakermania
6. Ubuntu
I am using ubuntu 10.04 LTS
I tried to download the file using wget; the file size is 105.00 MB, but wget downloads only around 44K.
Maybe I am using wget in the wrong way. Any suggestions?
Below is the command I used and the response from system.
wget --tries=10 -nd -nH --use=user... (10 Replies)
Discussion started by: LinuxLearner
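When a large download stops after a few kilobytes, the transfer was usually interrupted (or the server returned an error page instead of the file). A sketch with a placeholder URL, using `-c` to resume the partial file rather than restarting:

```shell
#!/bin/sh
# Sketch: resume an interrupted download. The URL is a placeholder.
url='http://example.com/big-file.iso'
tries=10

# -c continues from where a partial local file left off;
# --tries retries the connection on transient failures.
if wget -c --tries="$tries" "$url"; then
    echo "download complete"
else
    echo "download failed" >&2
fi
```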
7. Shell Programming and Scripting
Hi
I need a shell script that will download a zip file every second from an http server, but I can't use either curl or wget.
Can anyone help me go about this task?
Thanks!! (1 Reply)
Discussion started by: rubber08
8. Shell Programming and Scripting
Hi
I am trying to download a file using the wget command. The password was created as pwd$$ for the user xyz. When I give the command as below, it does not download the file. Could the $$ in the password be causing this issue?
wget... (0 Replies)
Discussion started by: ksmbabu
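Very likely, yes: in the shell, `$$` unquoted or inside double quotes expands to the shell's process ID, so the literal password never reaches the server. Single quotes keep it intact. A sketch with a placeholder host:

```shell
#!/bin/sh
# Sketch: a password containing "$$" must be single-quoted, because the
# shell otherwise replaces $$ with its own process ID.
pass='pwd$$'                       # literal five characters: p w d $ $

# Expanding "$pass" in double quotes here is safe: the variable's
# contents are not re-scanned for $ expansions.
if wget --http-user=xyz --http-password="$pass" \
        'http://example.com/file.tar.gz'; then
    echo "download complete"
else
    echo "download failed" >&2
fi
```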
9. Shell Programming and Scripting
Hello all,
I want to write an auto-update script for my embedded device, which can check for and download a newer version of my program and extract the files on the device.
The download center is hosted on a remote web server.
Script checks the hosted file on web site and if the new version is there... (8 Replies)
Discussion started by: stefki
10. Shell Programming and Scripting
Hi All,
I am trying to download an XML from a URL through wget, and that part succeeds, but I have to check for some special characters inside that XML. When I download through wget it transfers the content of the XML as plain text and I'm not able to search for those... (2 Replies)
Discussion started by: dips_ag
LEARN ABOUT MOJAVE
lwp-download (5.18)
LWP-DOWNLOAD(1) User Contributed Perl Documentation LWP-DOWNLOAD(1)
NAME
lwp-download - Fetch large files from the web
SYNOPSIS
lwp-download [-a] [-s] <url> [<local path>]
DESCRIPTION
The lwp-download program will save the file at url to a local file.
If local path is not specified, then the current directory is assumed.
If local path is a directory, then the last segment of the path of the url is appended to form a local filename. If the url path ends with a
slash, the name "index" is used. With the -s option, the last segment of the filename is picked up from server-provided sources like the
Content-Disposition header or any redirect URLs. A file extension to match the server-reported Content-Type might also be appended. If a file
with the produced filename already exists, then lwp-download will prompt before it overwrites and will fail if its standard input is not a
terminal. This form of invocation will also fail if no acceptable filename can be derived from the sources mentioned above.
If local path is not a directory, then it is simply used as the path to save into. If the file already exists it's overwritten.
The lwp-download program is implemented using the libwww-perl library. It is better suited to download big files than the lwp-request
program because it does not store the file in memory. Another benefit is that it will keep you updated about its progress and that you
don't have many options to worry about.
Use the "-a" option to save the file in text (ascii) mode. Might make a difference on dosish systems.
EXAMPLE
Fetch the newest and greatest perl version:
$ lwp-download http://www.perl.com/CPAN/src/latest.tar.gz
Saving to 'latest.tar.gz'...
11.4 MB received in 8 seconds (1.43 MB/sec)
AUTHOR
Gisle Aas <gisle@aas.no>
perl v5.18.2 2012-01-13 LWP-DOWNLOAD(1)