06-13-2008
download a particular file using wget
Hi All
I want to download the file srs8.3.0.1.standard.linux24_EM64T.tar.gz from the following website:
http://downloads.biowisdomsrs.com/srs83_dist/
This website contains many zipped files, but I want to download only the file above and skip the others.
When I try the following command, it downloads all of the zipped files one by one:
wget -nd -r -l1 --no-parent -A.gz http://downloads.biowisdomsrs.com/srs83_dist --http-user=glaxowel --http-passwd=M=P28c --proxy-user=gga0855 --proxy-passwd=testme2 --proxy=on
Does wget have an option to download only a particular file, or is there another command that can do this?
Please help.
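For reference, wget downloads just one file when it is given that file's full URL directly, with no recursion or accept-list options at all. A minimal sketch, with the credentials replaced by placeholders:

```shell
# Build the direct URL to the single file; pointing wget at an exact
# file URL (rather than the directory listing) fetches only that file,
# so -r, -l1, --no-parent and -A are not needed.
BASE="http://downloads.biowisdomsrs.com/srs83_dist"
FILE="srs8.3.0.1.standard.linux24_EM64T.tar.gz"
echo "$BASE/$FILE"

# With real credentials filled in, the download is just:
#   wget --http-user=USER --http-passwd='PASS' \
#        --proxy-user=PUSER --proxy-passwd='PPASS' "$BASE/$FILE"
```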
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hi
I need a shell script that will download a text file every second from an HTTP server using wget.
Can anyone provide pointers or a sample script to help me with this task?
regards
techie (1 Reply)
Discussion started by: techie82
2. UNIX for Dummies Questions & Answers
Hello Everyone,
I'm trying to use wget recursively to download a file.
Only html files are being downloaded, instead of the target file.
I'm trying this for the first time, here's what I've tried:
wget -r -O jdk.bin... (4 Replies)
Discussion started by: thoughts
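For what it's worth, this is a common pitfall: with -r, wget crawls the linked pages, and -O funnels everything it fetches into the one named file, so the file ends up full of HTML. A sketch of the usual fix (the URL here is hypothetical):

```shell
# Drop -r and give the direct URL; -O then simply renames the saved file.
# echo is used here so the command is shown rather than run; remove it to fetch.
echo wget -O jdk.bin "http://example.com/java/jdk.bin"
```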
3. UNIX and Linux Applications
I need to download the file srs8.3.0.1.standard.linux26_32.tar.gz from the following website:
http://downloads.biowisdomsrs.com/srs83_dist
There are many gzip files along with the above one on that site, but I want to download only srs8.3.0.1.standard.linux26_32.tar.gz from... (1 Reply)
Discussion started by: alphasahoo
4. Shell Programming and Scripting
Hi,
I want to download some online data using the wget command and write the contents to a file.
For example, this is the URL I want to download and store in a file called "results.txt".
#This is the URL.
$url="http://www.example.com";
#retrieve data and store in a file results.txt
... (3 Replies)
Discussion started by: vanitham
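Assuming wget is available, one common pattern is to let -O name the output file; the URL below is the example one from the post:

```shell
url="http://www.example.com"
# -q silences the progress log; -O writes the fetched content to results.txt.
# echo is used here so the command is shown rather than run; remove it to fetch.
echo wget -q -O results.txt "$url"
```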
5. Shell Programming and Scripting
Ok, this is quite weird.
wget -r mysite.com/mylink/
should get all the files recursively from the 'mylink' folder.
The problem is that wget saves an index.html file!
When I open this index.html with my browser I realize that it shows all the files in the current folder (plus an option to move... (3 Replies)
Discussion started by: hakermania
6. Ubuntu
I am using ubuntu 10.04 LTS
I tried to download a file using wget; the file size is 105.00 MB, but wget downloads only around 44K.
Maybe I am using wget the wrong way. Any suggestions?
Below is the command I used and the response from the system.
wget --tries=10 -nd -nH --use=user... (10 Replies)
Discussion started by: LinuxLearner
7. Shell Programming and Scripting
Hi
I need a shell script that will download a zip file every second from an HTTP server, but I can't use either curl or wget.
Can anyone help me with this task?
Thanks!! (1 Reply)
Discussion started by: rubber08
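One option, assuming the shell is bash and the server speaks plain HTTP (not HTTPS), is bash's built-in /dev/tcp redirection. A sketch, written as a function so nothing connects until it is called; the host and path in the usage comment are hypothetical:

```shell
# Minimal HTTP/1.0 GET over bash's /dev/tcp pseudo-device.
fetch() {
    local host=$1 path=$2 out=$3
    exec 3<>"/dev/tcp/$host/80"                                   # open a TCP connection on fd 3
    printf 'GET %s HTTP/1.0\r\nHost: %s\r\n\r\n' "$path" "$host" >&3
    cat <&3 > "$out"        # note: the output still includes the response headers
    exec 3<&-               # close the connection
}
# usage, once a second:
#   while sleep 1; do fetch example.com /files/archive.zip archive.zip; done
```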
8. Shell Programming and Scripting
Hi
I am trying to download a file using the wget command. The password for user xyz was created as pwd$$. When I give the command below, it does not download the file. Could the $$ in the password be causing this issue?
wget... (0 Replies)
Discussion started by: ksmbabu
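Quite possibly, yes: unquoted or inside double quotes, $$ expands to the shell's process ID, so wget receives a different password from the one that was set. A sketch of the usual fix (the user and URL are from the post / hypothetical):

```shell
PASS='pwd$$'     # single quotes keep $$ literal; "pwd$$" would expand to pwd<PID>
echo "$PASS"     # prints pwd$$, not pwd<PID>
# wget --http-user=xyz --http-passwd="$PASS" http://host/path/file
```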
9. Shell Programming and Scripting
Hello all,
I want to write an auto-update script for my embedded device, which can check for and download a newer version of my program and extract the files on the device.
The download center is hosted on a remote web server.
The script checks the hosted file on the web site, and if the new version is there... (8 Replies)
Discussion started by: stefki
10. Shell Programming and Scripting
Hi All,
I am trying to download an XML file from a URL through wget, and I am successful in that, but the problem is that I have to check for some special characters inside that XML. When I download it through wget, it transfers the content of the XML as plain text and I'm not able to search for those... (2 Replies)
Discussion started by: dips_ag
LEARN ABOUT DEBIAN
yaz-url
YAZ-URL(1) Commands YAZ-URL(1)
NAME
yaz-url - YAZ URL fetch utility
SYNOPSIS
yaz-url [-H name:value] [-m method] [-O fname] [-p fname] [-u user/password] [-x proxy] [url...]
DESCRIPTION
yaz-url is a utility for fetching web content. It is very limited in functionality compared to programs such as curl and wget.
Options must precede the URL given on the command line to take effect.
Fetched HTTP content is written to stdout, unless option -O is given.
OPTIONS
-H name:value
Specifies HTTP header content with name and value. This option can be given multiple times (for different names, of course).
-m method
Specifies the HTTP method to be used for the next URL. Default is method "GET". However, option -p sets it to "POST".
-O fname
Sets output filename for HTTP content.
-p fname
Sets a file to be POSTed in the following URL.
-u user/password
Specifies a user and a password to be used in HTTP basic authentication for the following URL fetch. The user and password must be
separated by a slash (thus it is not possible to specify a user with a slash in it).
-x proxy
Specifies a proxy to be used for URL fetch.
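A hedged sketch of typical invocations, inferred from the options above; the host, filenames, and proxy address are hypothetical:

```shell
# Save a GET response to a file instead of stdout:
yaz-url -O page.html http://example.com/

# POST a file, with basic auth and a proxy (options precede the URL they affect):
yaz-url -u alice/secret -x proxy.local:3128 -p body.xml http://example.com/api
```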
SEE ALSO
yaz(7)
YAZ 4.2.30 04/16/2012 YAZ-URL(1)