Problem with wget
Posted by max29583, 09-10-2008

Hi,

I want to download some patches from Sun using a script, and I am using wget as the utility for this.

The download URL uses the "https:" scheme, as below:
https://sunsolve.sun.com/private-cgi/pdownload.pl?target=${line}&method=h

On running wget as below:

wget --http-user=${UserID} --http-passwd=${UserPWD} \
     --proxy-user=${PROXYUSER} --proxy-passwd=${PROXYPASSWD} \
     -nv "https://sunsolve.sun.com/private-cgi/pdownload.pl?target=${line}&method=h" \
     -O ${DOWNLOADIR}/${line}.zip >> ${LOGFILE} 2>&1
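
Before debugging wget itself, it can help to confirm what the shell actually expands and passes to wget; a minimal check below, where the value of line is a hypothetical sample patch id, not one from the post:

line="123456-01"   # hypothetical patch id, for illustration only
# The quotes around the URL matter: unquoted, the shell would split
# the command at "&" and run wget in the background with a truncated URL.
printf '%s\n' "https://sunsolve.sun.com/private-cgi/pdownload.pl?target=${line}&method=h"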

I am getting this error:

https://sunsolve.sun.com/private-cgi...1-01&method=h: Unsupported scheme.


Please help me find out why wget reports "Unsupported scheme".

regards,
Abhi
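
A likely cause, for anyone hitting the same message: a wget binary built without SSL support does not recognize the https:// scheme at all and fails with exactly this "Unsupported scheme" error. Below is a sketch of how to check and work around it, assuming GNU wget and, as a fallback, a curl build with SSL support; the variable names are taken from the post above:

# Newer wget builds list compiled-in features (look for https/ssl)
# in the --version output; on older builds, simply try any https URL.
wget --version
wget -nv -O /dev/null "https://sunsolve.sun.com/" && echo "https works"

# Fallback sketch using curl for the same download:
curl --user "${UserID}:${UserPWD}" \
     --proxy-user "${PROXYUSER}:${PROXYPASSWD}" \
     --output "${DOWNLOADIR}/${line}.zip" \
     "https://sunsolve.sun.com/private-cgi/pdownload.pl?target=${line}&method=h" \
     >> "${LOGFILE}" 2>&1

If wget itself is required, installing or rebuilding it with SSL support (e.g. linked against OpenSSL) removes the error.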
 
