Full Discussion: Problem with wget
Special Forums > IP Networking > Proxy Server
Post 302922210 by nirosha on Thursday 23rd of October 2014, 08:40 AM
Quote:
Originally Posted by sea
Seems to be working (in general):
Code:
:) ~ $ wget https://wordpress.org/latest.tar.gz
--2014-10-23 10:41:22--  https://wordpress.org/latest.tar.gz
Resolving wordpress.org (wordpress.org)... 66.155.40.249, 66.155.40.250
Connecting to wordpress.org (wordpress.org)|66.155.40.249|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 6051082 (5.8M) [application/octet-stream]
Saving to: 'latest.tar.gz'

100%[=====================================================================================================================>] 6'051'082   1.21MB/s   in 5.5s   

2014-10-23 10:41:29 (1.05 MB/s) - 'latest.tar.gz' saved [6051082/6051082]

Did you try with curl?
Although I would expect the same behaviour.

Code:
curl -o latest.tar.gz https://wordpress.org/latest.tar.gz

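If there is a proxy between you and the server, curl can also be pointed at it explicitly on the command line; the proxy host, port, and credentials below are placeholders, not values from your setup:
Code:
# -x/--proxy names the proxy; --proxy-user supplies credentials if the proxy requires them
curl -x http://proxy.example.com:3128 --proxy-user myuser:mypass -o latest.tar.gz https://wordpress.org/latest.tar.gz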
Either way, you can pass your proxy server details to wget directly.
Code:
+ ~ $ wget --help |grep proxy
       --no-proxy                explicitly turn off proxy.
       --proxy-user=USER       set USER as proxy username.
       --proxy-password=PASS   set PASS as proxy password.

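Besides those flags, both wget and curl honour the standard proxy environment variables, so another option is to export them before running either tool (again, placeholder host and port):
Code:
# picked up by wget and curl alike; adjust to your real proxy
export http_proxy=http://proxy.example.com:3128/
export https_proxy=http://proxy.example.com:3128/
wget https://wordpress.org/latest.tar.gz
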
Hope this helps

Thank you for the kind help. As you expected, curl didn't work either.

So I added the proxy settings to the wget configuration file; after that wget connected but stopped with a certificate error, so I added --no-check-certificate and the download worked (a sketch of the kind of settings involved follows below).
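
For reference, a minimal ~/.wgetrc along these lines has that effect; the proxy host, port, and credentials are placeholders rather than the real values from this setup, and check_certificate = off is the wgetrc counterpart of --no-check-certificate:
Code:
# ~/.wgetrc (or /etc/wgetrc) - hypothetical proxy details
use_proxy = on
http_proxy = http://proxy.example.com:3128/
https_proxy = http://proxy.example.com:3128/
proxy_user = myuser
proxy_password = mypass
# same effect as --no-check-certificate; only sensible if the proxy re-signs TLS traffic
check_certificate = off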

Any idea what was wrong? Ping to google did not work either.
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

wget -r

I have noticed a lot of expensive books appearing online, so I have decided to copy them to CD. I was going to write a program in Java to do this, but remembered the wget GNU program some of you guys were talking about. Instead of spending two hours or so writing a program to do this.... (1 Reply)
Discussion started by: photon
1 Replies

2. UNIX for Advanced & Expert Users

Wget FTP problem!

Hi, I've tried to download from ftp sites with wget but it fails with "Service unavailable", whereas when I use sftp in binary mode and the "get" command it works perfectly. What's the problem? BTW: I tried both passive and active mode in wget. Thanks for your help (9 Replies)
Discussion started by: mjdousti
9 Replies

3. Shell Programming and Scripting

Problem with wget

Hi, I want to download some patches from SUN using a script, and I am using "wget" as the utility for this. The website for downloading has "https:" in its name, as below https://sunsolve.sun.com/private-cgi/pdownload.pl?target=${line}&method=h and on running wget as below wget... (1 Reply)
Discussion started by: max29583
1 Replies

4. UNIX for Dummies Questions & Answers

wget pdf downloading problem

Hi. I am trying to make a mirror of this free online journal: http://www.informaworld.com/smpp/title~content=t716100758~db=all Under the individual issues, the link location for the "Full Text PDF" does not have ".pdf" as an extension -- so when I use wget it misses the file. However clicking... (5 Replies)
Discussion started by: obo1234
5 Replies

5. UNIX for Dummies Questions & Answers

Problem with wget no check certificate.

Hi, I'm trying to install some libraries; when running the makefile I get an error from the wget "--no-check-certificate" option. I had a look at the help and the option wasn't listed. Anyone know what I'm missing? (0 Replies)
Discussion started by: davcra
0 Replies

6. Shell Programming and Scripting

wget help?

Can someone please help me understand this shell script? wget --progress=dot:mega --cut-dirs=4 -r -c -nH -np --reject index.html*,icons/*.gif \ http://*****.oz.xxxxx.com:<portnum>/omcsm/releases/dew/${UPGRADE_VERSION}/ (1 Reply)
Discussion started by: dnam9917
1 Replies

7. Shell Programming and Scripting

Problem with wget and cookie

Dear people, I have a problem with a script that uses wget to download pdf files from a website which uses session cookies. Background: for university it's quite tedious to check each week which new homework, papers etc. are available on the sites of the university's different chairs. So I wanted a... (1 Reply)
Discussion started by: jackomo
1 Replies

8. UNIX for Dummies Questions & Answers

Wget -i URLs.txt problem

Hi Everyone, I have a problem with wget using an input file of URLs. When I execute this -> wget -i URLs.txt I get the login.php pages transferred but not the files I have in the URLs.txt file. I need to use the input file because it will have new products to download each week. I want my VA to... (3 Replies)
Discussion started by: Keith londrie
3 Replies

9. Red Hat

Wget

If I run the following command: wget -r --no-parent --reject "index.html*" 10.11.12.13/backups/, a local directory named 10.11.12.13/backups containing the web site data is created. What I want to do is have the data placed in a local directory called $HOME/backups. Thanks for... (1 Reply)
Discussion started by: popeye
1 Replies

10. Shell Programming and Scripting

Wget - working in browser but cannot download from wget

Hi, I need to download a zip file from the US govt link below. https://www.sam.gov/SAMPortal/extractfiledownload?role=WW&version=SAM&filename=SAM_PUBLIC_MONTHLY_20160207.ZIP I only have the wget utility installed on the server. When I use the below command, I get error 403... (2 Replies)
Discussion started by: Prasannag87
2 Replies
CURLOPT_HTTPPROXYTUNNEL(3)            curl_easy_setopt options            CURLOPT_HTTPPROXYTUNNEL(3)

NAME
       CURLOPT_HTTPPROXYTUNNEL - tunnel through HTTP proxy

SYNOPSIS
       #include <curl/curl.h>

       CURLcode curl_easy_setopt(CURL *handle, CURLOPT_HTTPPROXYTUNNEL, long tunnel);

DESCRIPTION
       Set the parameter to 1 to make libcurl tunnel all operations through the HTTP proxy. There
       is a big difference between using a proxy and tunneling through it. If you don't know what
       this means, you probably don't want this tunneling option. Tunneling essentially means that
       a CONNECT request is sent to the proxy, asking it to connect to a remote host on a specific
       port number, and then the traffic is just passed through the proxy. Proxies tend to
       whitelist the port numbers they allow CONNECT requests to, and often only ports 80 and 443
       are allowed. This option only makes sense together with CURLOPT_PROXYTYPE(3) set to an HTTP
       proxy. To suppress proxy CONNECT response headers from user callbacks, use
       CURLOPT_SUPPRESS_CONNECT_HEADERS(3).

DEFAULT
       0

PROTOCOLS
       All network protocols

EXAMPLE
       TODO

AVAILABILITY
       Always

RETURN VALUE
       Returns CURLE_OK

SEE ALSO
       CURLOPT_PROXY(3), CURLOPT_PROXYTYPE(3), CURLOPT_PROXYPORT(3)

libcurl 7.54.0                          April 28, 2016                  CURLOPT_HTTPPROXYTUNNEL(3)
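
On the command line, the rough counterpart of this libcurl option is curl's -p/--proxytunnel flag combined with -x; a minimal sketch, with a placeholder proxy host and port:
Code:
# ask the proxy for a CONNECT tunnel instead of plain proxying (placeholder proxy)
curl -x http://proxy.example.com:3128 --proxytunnel -o latest.tar.gz https://wordpress.org/latest.tar.gz

For https:// URLs a CONNECT tunnel is used automatically; the flag mainly matters when pushing non-HTTPS protocols through an HTTP proxy.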
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.