Full Discussion: Problem with wget
Posted by max29583 in Shell Programming and Scripting on Wednesday, 10 September 2008, 06:52 AM
Problem with wget

Hi,

I want to download some patches from Sun using a script, and I am using wget as the utility for this.

The download site uses https: in its URL, as below:
https://sunsolve.sun.com/private-cgi/pdownload.pl?target=${line}&method=h

and on running wget as below:

wget --http-user=${UserID} --http-passwd=${UserPWD} \
     --proxy-user=${PROXYUSER} --proxy-passwd=${PROXYPASSWD} -nv \
     "https://sunsolve.sun.com/private-cgi/pdownload.pl?target=${line}&method=h" \
     -O ${DOWNLOADIR}/${line}.zip >> ${LOGFILE} 2>&1

I am getting the error:

https://sunsolve.sun.com/private-cgi...1-01&method=h: Unsupported scheme.


Please help me find out why wget reports "Unsupported scheme".

regards,
Abhi
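
For anyone hitting the same message: "Unsupported scheme" is what wget prints when asked for a URL scheme it was not built to handle, and wget binaries compiled without SSL support treat every https:// URL that way. That cause is an inference from the message, not something confirmed in this thread; a quick check, as a sketch:

    # Does this wget build support HTTPS at all?
    wget --version | head -1
    # Depending on the build, the features line of --version mentions ssl/https;
    # no output here suggests an SSL-less build:
    wget --version | grep -i ssl
    # Functional test against any HTTPS site:
    wget -nv -O /dev/null https://sunsolve.sun.com/

If the build indeed lacks HTTPS support, curl (where installed) accepts equivalent options. The sketch below mirrors the original command using the same script variables; curl picks up the proxy host from the usual https_proxy environment variable:

    # Hedged drop-in alternative, not verified against SunSolve itself.
    curl --user "${UserID}:${UserPWD}" \
         --proxy-user "${PROXYUSER}:${PROXYPASSWD}" \
         -o "${DOWNLOADIR}/${line}.zip" \
         "https://sunsolve.sun.com/private-cgi/pdownload.pl?target=${line}&method=h" \
         >> ${LOGFILE} 2>&1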
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

wget -r

I have noticed a lot of expensive books appearing online, so I have decided to copy them to CD. I was going to write a program in Java to do this, but remembered the GNU wget program some of you were talking about. Instead of spending two hours or so writing a program to do this.... (1 Reply)
Discussion started by: photon

2. UNIX for Advanced & Expert Users

Wget FTP problem!

Hi, I've tried to download from FTP sites with wget, but it fails with "Service unavailable"; when I use sftp in binary mode and the "get" command, it works perfectly. What's the problem? BTW: I tried both passive and active mode in wget. Thanks for your help. (9 Replies)
Discussion started by: mjdousti

3. UNIX for Dummies Questions & Answers

wget pdf downloading problem

Hi. I am trying to make a mirror of this free online journal: http://www.informaworld.com/smpp/title~content=t716100758~db=all Under the individual issues, the link location for the "Full Text PDF" does not have ".pdf" as an extension -- so when I use wget it misses the file. However clicking... (5 Replies)
Discussion started by: obo1234

4. UNIX for Dummies Questions & Answers

Problem with wget no check certificate.

Hi, I'm trying to install some libraries; when running the makefile I get an error from the wget "--no-check-certificate" option. I had a look at the help and the option wasn't listed. Anyone know what I'm missing? (0 Replies)
Discussion started by: davcra

5. Shell Programming and Scripting

wget help?

Can someone please help in understanding this shell script? (The options are glossed just below.) wget --progress=dot:mega --cut-dirs=4 -r -c -nH -np --reject index.html*,icons/*.gif \ http://*****.oz.xxxxx.com:<portnum>/omcsm/releases/dew/${UPGRADE_VERSION}/ (1 Reply)
Discussion started by: dnam9917
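
All of the options in that command are standard GNU wget; a quick gloss from the wget manual (the masked host and ${UPGRADE_VERSION} are as posted, left untouched):

    # Breakdown of the wget options used above:
    #   --progress=dot:mega    dot-style progress meter, "mega" granularity
    #   --cut-dirs=4           drop the first 4 remote directory components
    #                          when building local paths
    #   -r                     recursive retrieval
    #   -c                     continue partially downloaded files
    #   -nH                    do not create a host-name directory
    #   -np                    never ascend to the parent directory
    #   --reject 'index.html*,icons/*.gif'
    #                          skip files matching these patterns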

6. Shell Programming and Scripting

Problem with wget and cookie

Dear people, I have a problem with a script using wget to download PDF files from a website that uses session cookies. Background: for university it is quite tedious to check weekly which new homework, papers, etc. are available on the various pages of the university's chairs. So I wanted a... (1 Reply)
Discussion started by: jackomo

7. UNIX for Dummies Questions & Answers

Wget -i URLs.txt problem

Hi everyone, I have a problem with wget using an input file of URLs. When I execute wget -i URLs.txt, I get the login.php pages transferred but not the files listed in URLs.txt. I need to use the input file because it will have new products to download each week. I want my VA to... (3 Replies)
Discussion started by: Keith londrie

8. Red Hat

Wget

If I run the following command: wget -r --no-parent --reject "index.html*" 10.11.12.13/backups/ a local directory named 10.11.12.13/backups containing the web site data is created. What I want is for the data to be placed in a local directory called $HOME/backups; one approach is sketched below. Thanks for... (1 Reply)
Discussion started by: popeye
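
One common way to get that layout (a sketch, assuming GNU wget): -nH suppresses the 10.11.12.13 host directory and -P sets the download prefix, so the remote backups/ path lands directly under $HOME:

    wget -r -nH --no-parent --reject "index.html*" -P "$HOME" 10.11.12.13/backups/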

9. Proxy Server

Problem with wget

I cannot download anything using wget on CentOS 6.5 and 7, but yum updates etc. work. # wget https://wordpress.org/latest.tar.gz --2014-10-23 13:50:23-- https://wordpress.org/latest.tar.gz Resolving wordpress.org... 66.155.40.249, 66.155.40.250 Connecting to wordpress.org|66.155.40.249|:443...... (3 Replies)
Discussion started by: nirosha

10. Shell Programming and Scripting

Wget - working in browser but cannot download from wget

Hi, I need to download a zip file from the US government link below. https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP I only have the wget utility installed on the server (a common workaround for 403s is sketched below). When I use the command below, I am getting error 403... (2 Replies)
Discussion started by: Prasannag87
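
A frequent cause of "403 in wget but fine in a browser" is the server filtering on the User-Agent header; a hedged sketch (the server may of course be checking something else, such as cookies or a referer):

    # Present a browser-like User-Agent; URL exactly as posted above.
    wget -U "Mozilla/5.0" \
         "https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP" \
         -O SAM_PUBLIC_MONTHLY_20160207.ZIP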
httpindex(1)                    General Commands Manual                   httpindex(1)

NAME
       httpindex - HTTP front-end for SWISH++ indexer

SYNOPSIS
       wget [ options ] URL... 2>&1 | httpindex [ options ]

DESCRIPTION
       httpindex is a front-end for index++(1) to index files copied from remote
       servers using wget(1). The files (in a copy of the remote directory
       structure) can be kept, deleted, or replaced with their descriptions after
       indexing.

OPTIONS
   wget Options
       The wget(1) options that are required are: -A, -nv, -r, and -x; the ones
       that are highly recommended are: -l, -nh, -t, and -w. (See the EXAMPLE.)

   httpindex Options
       httpindex accepts the same short options as index++(1) except for -H, -I,
       -l, -r, -S, and -V. The following options are unique to httpindex:

       -d     Replace the text of local copies of retrieved files with their
              descriptions after they have been indexed. This is useful for
              displaying file descriptions in search results without having to
              keep complete copies of the remote files, thus saving filesystem
              space. (See the extract_description() function in WWW(3) for
              details about how descriptions are extracted.)

       -D     Delete the local copies of retrieved files after they have been
              indexed. This prevents your local filesystem from filling up with
              copies of remote files.

EXAMPLE
       To index all HTML and text files on a remote web server, keeping
       descriptions locally:

           wget -A html,txt -linf -t2 -rxnv -nh -w2 http://www.foo.com 2>&1 |
               httpindex -d -e'html:*.html,text:*.txt'

       Note that you need to redirect wget(1)'s output from standard error to
       standard output in order to pipe it to httpindex.

EXIT STATUS
       Exits with a value of zero only if indexing completed successfully;
       non-zero otherwise.

CAVEATS
       In addition to those for index++(1), httpindex does not correctly handle
       the use of multiple -e, -E, -m, or -M options (because the Perl script
       uses the standard Getopt::Std package for processing command-line
       options, which doesn't). The last of any of those options "wins." The
       work-around is to give multiple values, separated by commas, to a single
       one of those options. For example, instead of:

           httpindex -e'html:*.html' -e'text:*.txt'

       do this:

           httpindex -e'html:*.html,text:*.txt'

SEE ALSO
       index++(1), wget(1), WWW(3)

AUTHOR
       Paul J. Lucas <pauljlucas@mac.com>

SWISH++                            August 2, 2005                         httpindex(1)
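
A small variation on the manual's example, assuming the same fictional server: use the documented -D instead of -d to index and then discard the local copies entirely, keeping only the index:

    wget -A html,txt -linf -t2 -rxnv -nh -w2 http://www.foo.com 2>&1 |
        httpindex -D -e'html:*.html,text:*.txt'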