Wget fails for a valid URL
Posted by mohtashims in Shell Programming and Scripting on 02-24-2014

Wget Error Codes:
Code:
    0 No problems occurred.
    1 Generic error code.
    2 Parse error—for instance, when parsing command-line options, the .wgetrc or .netrc…
    3 File I/O error.
    4 Network failure.
    5 SSL verification failure.
    6 Username/password authentication failure.
    7 Protocol errors.
    8 Server issued an error response.
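
In a script, the code wget exits with is whatever $? holds immediately after the call; a minimal sketch of capturing it (the URL is a placeholder):

Code:
    #!/bin/sh
    url="https://www.example.com/"   # placeholder URL
    wget -q "$url"
    rc=$?
    echo "wget exit code: $rc"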

With wget -q --no-check-certificate $url I get exit code 8 instead of 0, and no output.

Without the --no-check-certificate option, the output below is what I receive.

Code:
--2014-02-23 08:37:53--  https://cxyz:9443/Services/Account.jws
Resolving cxyz... 255.275.82.125
Connecting to cxyz|255.275.82.125|:9443... connected.
WARNING: cannot verify cxyz's certificate, issued by  `/C=US/O=xxxx./OU=xxxx/OU=Terms of use at https://www.verisign.com/rpa  (c)10/CN=VeriSign Class 3 Secure Server CA - G3':
  Self-signed certificate encountered.
HTTP request sent, awaiting response... 500 Server Error
2014-02-23 08:37:53 ERROR 500: Server Error.

Please suggest a fix!

The exit code is 8.

How can I get exit code 0 for a URL with an invalid certificate? All I need to know is whether the URL exists or not.
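
wget has no option to remap its exit codes, but exit code 8 means the server answered with an HTTP error (here the 500), which already proves the host is up and responding. So rather than forcing wget to return 0, a small wrapper can accept both 0 and 8 as "URL exists". A minimal sketch, assuming reachability really is all that matters (the URL is the placeholder from the output above):

Code:
    #!/bin/sh
    url="https://cxyz:9443/Services/Account.jws"   # placeholder host from the log
    wget -q --spider --no-check-certificate "$url"
    rc=$?
    case $rc in
        0|8) echo "URL exists (server responded)"; exit 0 ;;
        *)   echo "URL not reachable (wget exit code $rc)"; exit $rc ;;
    esac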
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

How to get the page size (of a url) using wget

Hi, I am trying to get the page size of a URL (e.g., www.example.com) using the wget command. Any thoughts on which parameters I need to pass to wget to get the size alone? Regards, Raj (1 Reply)
Discussion started by: rajbal
1 Reply
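
For the page-size question above, one approach is to request headers only and read Content-Length; a sketch, assuming the server actually sends that header (it is optional):

Code:
    wget --spider --server-response http://www.example.com 2>&1 \
        | awk 'tolower($1) == "content-length:" { size = $2 } END { print size }'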

2. Shell Programming and Scripting

wget to check a URL

Hi all, I wrote a script which starts a WebLogic server and waits until it is loaded to deploy several apps. The way I checked was something like: while ; do wget --spider <URL>:<port>/console > /dev/null 2>&1 rc=$? done This works perfectly because it's an HTML site and when the server is... (2 Replies)
Discussion started by: AlbertGM
2 Replies
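
The forum software has evidently eaten the test expression after "while" in the excerpt above; the usual shape of that loop looks something like the sketch below, where localhost:7001 merely stands in for the post's <URL>:<port> (7001 is only WebLogic's usual default):

Code:
    rc=1
    while [ $rc -ne 0 ]; do
        wget -q --spider http://localhost:7001/console > /dev/null 2>&1
        rc=$?
        [ $rc -ne 0 ] && sleep 5   # rest between attempts so this doesn't spin
    done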

3. Shell Programming and Scripting

how to judge whether a URL is valid or not using awk

As per the title; thanks. (6 Replies)
Discussion started by: rainboisterous
6 Replies
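
For the awk question above, a sketch using a deliberately rough ERE for http/https URLs (real URL grammar is far looser, so treat the pattern as a starting point; urls.txt is a placeholder input file):

Code:
    awk '/^https?:\/\/[[:alnum:].-]+(:[0-9]+)?(\/[^[:space:]]*)?$/ { print "valid:   " $0; next }
         { print "invalid: " $0 }' urls.txt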

4. Linux

downloading using wget fails

I'm trying to build the Thrift library. It downloads some files from the Maven repo during the build process. It failed while downloading a file called "ivy-2.2.0.jar". Then I downloaded that file using my browser and succeeded. After that I tried to download the same file using wget, but it failed.... (13 Replies)
Discussion started by: xyzt
13 Replies
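
A browser succeeding where wget fails is often down to the server rejecting wget's default User-Agent, or to a redirect chain; a sketch of the usual first workaround (the URL is a placeholder, not the real Maven path):

Code:
    wget --user-agent="Mozilla/5.0" --tries=3 "http://repo.example.com/ivy-2.2.0.jar"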

5. UNIX for Dummies Questions & Answers

Awk: print all URL addresses between iframe tags without repeating an already printed URL

Here is what I have so far: find . -name "*php*" -or -name "*htm*" | xargs grep -i iframe | awk -F'"' '/<iframe*/{gsub(/.\*iframe>/,"\"");print $2}' Here is an example content of a PHP or HTM(HTML) file: <iframe src="http://ADDRESS_1/?click=5BBB08\" width=1 height=1... (18 Replies)
Discussion started by: striker4o
18 Replies
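
For the no-repeats requirement above, awk's associative arrays give a natural "first occurrence only" filter; a sketch that keeps the post's find/grep front end and assumes, as in the sample line, that src is the first quoted attribute:

Code:
    find . -name "*php*" -o -name "*htm*" \
        | xargs grep -ih '<iframe' \
        | awk -F'"' '/<iframe/ && !seen[$2]++ { print $2 }'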

6. UNIX for Dummies Questions & Answers

Launch a URL, validate username and password using wget or curl

Hi All, I want to launch (e.g.) http://gmail.com from the command window and validate the credentials with username and password; is it possible? I have found something like "wget --http-user=USER --http-password=PASSWORD http://gmail.com". I am new to this and unable to find a solution, i... (0 Replies)
Discussion started by: harsha85
0 Replies
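
Per the exit-code table at the top of this thread, code 6 specifically means authentication failure, so credentials can be validated from the exit status alone; a sketch (user, password, and URL are placeholders, and this only works against plain HTTP basic/digest auth, not a login form like Gmail's):

Code:
    wget -q --spider --http-user="$user" --http-password="$pass" "$url"
    case $? in
        0) echo "credentials accepted" ;;
        6) echo "username/password rejected" ;;
        *) echo "failed for another reason" ;;
    esac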

7. UNIX for Dummies Questions & Answers

Read URL data from UNIX without wget, curl, lynx, w3m

Hi Experts, Problem statement: We have a URL whose data we need to read and parse inside shell scripts. My AIX box has a very limited Perl installation, and I can't install any utilities either. Precisely, wget, cURL, Lynx, w3m, and LWP can't be used, as I only found these utilities when I googled... (0 Replies)
Discussion started by: scott_cog
0 Replies
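
If the AIX box has bash (or ksh93), the shell's built-in /dev/tcp pseudo-device can fetch a plain-HTTP URL with no external tools at all; a sketch (host, port, and path are placeholders, and this cannot do HTTPS):

Code:
    #!/bin/bash
    # open file descriptor 3 as a TCP connection, send a minimal request, read the reply
    exec 3<>/dev/tcp/www.example.com/80
    printf 'GET /index.html HTTP/1.0\r\nHost: www.example.com\r\n\r\n' >&3
    cat <&3     # prints headers plus body; parse from here
    exec 3<&-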

8. Shell Programming and Scripting

Read URL data from the UNIX CLI without wget, cURL, w3m, LWP

Hi Experts, Problem statement: We have a URL whose data we need to read and parse inside shell scripts. My AIX box has a very limited Perl installation, and I can't install any utilities either. Precisely, wget, cURL, Lynx, w3m, and LWP can't be used, as I only found these utilities when I googled it.... (12 Replies)
Discussion started by: scott_cog
12 Replies

9. Shell Programming and Scripting

Reading a URL using Mechanize and dumping all of its contents to a file

Hello, I am very new to Perl, please help me here! I need help reading a URL from the command line using WWW::Mechanize, and I need all the contents of the URL to go into a file. Below is the script which I have written so far: #!/usr/bin/perl use LWP::UserAgent; use... (2 Replies)
Discussion started by: scott_cog
2 Replies

10. UNIX for Beginners Questions & Answers

Regex for a valid URL

Hi guys, what is the regex to pick out only valid URLs from a file using grep? (2 Replies)
Discussion started by: Meeran Rizvi
2 Replies
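
A sketch with grep -E, reusing the same rough pattern as the awk version earlier in this list; -o prints just the matched URLs (GNU grep), and file.txt is a placeholder:

Code:
    grep -Eo 'https?://[[:alnum:].-]+(:[0-9]+)?(/[^[:space:]"<>]*)?' file.txt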