Wget retry on 500 internal error
Post 302885079 by Corona688 (UNIX for Dummies Questions & Answers), Thursday 23rd of January 2014, 10:58 AM
wget doesn't seem to have any option for "retry after a 500 server error", but you should at least be able to tell that wget has failed by its return code or, failing that, by its failure to create a file. A script could handle this, but it seems like you're feeding this into some other, non-shell system.
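A minimal sketch of such a wrapper script, assuming a reasonably recent GNU wget (which documents exit code 8 for "server issued an error response", e.g. a 500). The URL handling, retry count and delay below are placeholders, not anything wget provides itself:

    #!/bin/sh
    # Sketch: re-run wget a few times when it exits non-zero.
    url=$1          # pass the URL as the first argument
    tries=5         # arbitrary retry count
    delay=30        # seconds to wait between attempts

    n=1
    while [ "$n" -le "$tries" ]; do
        wget -q "$url" && exit 0               # success: file was created
        status=$?                              # wget's exit code (8 = server error)
        echo "wget failed with exit code $status (attempt $n of $tries)" >&2
        n=$((n + 1))
        sleep "$delay"
    done
    exit 1

Checking afterwards whether the expected output file exists would work just as well if whatever is driving wget can't see the exit status.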
 

URIFIND(1p)						User Contributed Perl Documentation					       URIFIND(1p)

NAME
       urifind - find URIs in a document and dump them to STDOUT

SYNOPSIS
       $ urifind file

DESCRIPTION
       urifind is a simple script that finds URIs in one or more files (using
       "URI::Find") and outputs them to STDOUT.  That's it.

       To find all the URIs in file1, use:

           $ urifind file1

       To find the URIs in multiple files, simply list them as arguments:

           $ urifind file1 file2 file3

       urifind will read from "STDIN" if no files are given or if a filename
       of "-" is specified:

           $ wget http://www.boston.com/ -O - | urifind

       When multiple files are listed, urifind prefixes each found URI with
       the file from which it came:

           $ urifind file1 file2
           file1: http://www.boston.com/index.html
           file2: http://use.perl.org/

       This can be turned on for single files with the "-p" ("prefix")
       switch:

           $ urifind -p file3
           file3: http://fsck.com/rt/

       It can also be turned off for multiple files with the "-n" ("no
       prefix") switch:

           $ urifind -n file1 file2
           http://www.boston.com/index.html
           http://use.perl.org/

       By default, URIs will be displayed in the order found; to sort them
       ascii-betically, use the "-s" ("sort") option.  To reverse sort them,
       use the "-r" ("reverse") flag ("-r" implies "-s").

           $ urifind -s file1 file2
           http://use.perl.org/
           http://www.boston.com/index.html
           mailto:webmaster@boston.com

           $ urifind -r file1 file2
           mailto:webmaster@boston.com
           http://www.boston.com/index.html
           http://use.perl.org/

       Finally, urifind supports limiting the returned URIs by scheme or by
       arbitrary pattern, using the "-S" option (for schemes) and the "-P"
       option.  Both "-S" and "-P" can be specified multiple times:

           $ urifind -S mailto file1
           mailto:webmaster@boston.com

           $ urifind -S mailto -S http file1
           mailto:webmaster@boston.com
           http://www.boston.com/index.html

       "-P" takes an arbitrary Perl regex.  It might need to be protected
       from the shell:

           $ urifind -P 's?html?' file1
           http://www.boston.com/index.html

           $ urifind -P '.org' -S http file4
           http://www.gnu.org/software/wget/wget.html

       Add a "-d" to have urifind dump the regexen generated from "-S" and
       "-P" to "STDERR".  "-D" does the same but exits immediately:

           $ urifind -P '.org' -S http -D
           $scheme = '^(http):'
           @pats = ('^(http):', '.org')

       To remove duplicates from the results, use the "-u" ("unique") switch.
OPTION SUMMARY
       -s          Sort results.

       -r          Reverse sort results (implies -s).

       -u          Return unique results only.

       -n          Don't include filename in output.

       -p          Include filename in output (0 by default, but 1 if
                   multiple files are included on the command line).

       -P $re      Print only lines matching regex '$re' (may be specified
                   multiple times).

       -S $scheme  Only this scheme (may be specified multiple times).

       -h          Help summary.

       -v          Display version and exit.

       -d          Dump compiled regexes for "-S" and "-P" to "STDERR".

       -D          Same as "-d", but exit after dumping.
AUTHOR
       darren chamberlain <darren@cpan.org>

COPYRIGHT
       (C) 2003 darren chamberlain

       This library is free software; you may distribute it and/or modify it
       under the same terms as Perl itself.

SEE ALSO
       URI::Find

perl v5.14.2                        2012-04-08                     URIFIND(1p)
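Tying the man page back to the thread: a combined invocation (the host name is only a placeholder) that fetches a page with wget and keeps just the unique, sorted http URIs urifind extracts from it:

    $ wget -q -O - http://www.example.com/ | urifind -s -u -S http

If wget fails here, urifind simply sees empty input, so any retry logic still belongs on the wget side, as in the sketch above.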