wget to check a URL
Posted by AlbertGM on Tuesday 18th of August 2009 08:08:15 AM

Hi all,
I wrote a script that starts a WebLogic server and waits until it has loaded before deploying several apps. The way I checked it was something like:
Code:
[...]
rc=1                                            # assume the console is not up yet
while [ $rc -ne 0 ]; do
   wget --spider <URL>:<port>/console > /dev/null 2>&1
   rc=$?                                         # becomes 0 once the console page answers
done
[...]

This works perfectly because the console is an HTML page, and once the server has started the page becomes accessible.
Now I need to do the same, but this time I have to wait for the server instances, which listen at the same address but on a different port than the main server.
I can't use the return code of wget because it always returns 1.
If the instance is up, wget responds:
Quote:
Connecting to 172.18.242.32:7001... connected.
HTTP Request sent... 404 Not Found
13:39:30 ERROR 404: Not Found.
If the instance is down:
Quote:
Connecting to 172.18.242.32:7002... failed: Connection refused.
The return code is always 1 because, even though the instance is listening, there is no HTML page to retrieve the way there is on the main server.
So how can I check whether an instance is available or not? Is there any other bash command to check a URL and port?

Thanks a lot, and sorry for my English.
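
One possible approach (a minimal sketch, not from the original thread): a 404 answer still means the port accepted a TCP connection, so the check can succeed on the connection itself instead of on wget's exit status. The host, port and sleep interval below are placeholders based on the quoted output, and the commented wget variant assumes an English-locale "connected." message.
Code:
HOST=172.18.242.32       # placeholder: instance address
PORT=7002                # placeholder: instance port

up=1
while [ $up -ne 0 ]; do
    # bash-only TCP probe: succeeds as soon as something accepts the connection
    (exec 3<>"/dev/tcp/$HOST/$PORT") > /dev/null 2>&1
    up=$?

    # wget variant: treat any HTTP answer, even a 404, as "instance is up"
    # wget --spider "http://$HOST:$PORT/" 2>&1 | grep -q 'connected' && up=0

    [ $up -ne 0 ] && sleep 5    # pause between retries while the instance starts
done
echo "Instance on ${HOST}:${PORT} is accepting connections."

The /dev/tcp redirection needs bash; with a plain POSIX shell, the commented wget-plus-grep line is the fallback.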
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Check URL using PERL

I am trying to create a perl script that will make sure a web page can be accessed going through an Apache httpd. The actual content of the web page does not matter. Most likely the web page will just have "You have successfully reached this port." This script will eventually be running... (5 Replies)
Discussion started by: rehoboth

2. Shell Programming and Scripting

How to get the page size (of a url) using wget

Hi, I am trying to get the page size of a URL (e.g. www.example.com) using the wget command. Any thoughts on what parameters I need to send with wget to get the size alone? Regards, Raj (1 Reply)
Discussion started by: rajbal
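
For the page-size question above, a hedged sketch (the URL is a placeholder): wget's --server-response option prints the HTTP headers, so the Content-Length header can be picked out without downloading the body. Servers that send chunked responses may omit that header.
Code:
# Print only the Content-Length reported by the server (wget sends headers to stderr).
wget --spider --server-response "http://www.example.com/" 2>&1 |
    awk 'tolower($1) == "content-length:" { print $2 }'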

3. Shell Programming and Scripting

Check URL with ksh

Hi everybody, I'm currently writing a ksh script which automates the entire startup of a large number of Tibco BusinessWorks domains, as well as all the deployed components running on them. My script is to be used after an infrastructure release, when the entire environment is down. It... (1 Reply)
Discussion started by: HexAnubis666

4. Shell Programming and Scripting

ksh to check url

I have a server that keeps going down (503 Service Not Available). Until we find out the cause, I would like to set up a simple ksh script in cron that will query the URL and report the status code. This way we can get someone to restart the process. Does anyone know a simple command I can call... (5 Replies)
Discussion started by: oldman2
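
For the 503-monitoring question above, a hedged sketch of a cron-friendly check; the URL and the alert recipient are placeholders, and the awk pattern relies on wget indenting header lines with two spaces when --server-response is used.
Code:
URL="http://myserver.example.com:8080/"        # placeholder
status=$(wget --spider --server-response --tries=1 "$URL" 2>&1 |
         awk '/^  HTTP\// { code = $2 } END { print code }')
if [ "$status" != "200" ]; then
    echo "$URL returned HTTP status ${status:-none}" | mail -s "service check failed" admin@example.com
fi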

5. Shell Programming and Scripting

Url Check for a keyword.

thanks (0 Replies)
Discussion started by: kata33

6. UNIX for Dummies Questions & Answers

Launch a URL,validate username and password using wget or curl

Hi All, I want to launch e.g. http://gmail.com from the cmd window and validate the credentials with a username and password; is it possible? I have found something like this: "wget --http-user=USER --http-password=PASSWORD http://gmail.com". I am new to this and unable to find a solution, I... (0 Replies)
Discussion started by: harsha85

7. UNIX for Dummies Questions & Answers

Read URL data from UNIX without wget,curl,lynx,w3m.

Hi Experts, Problem statement: we have a URL whose data we need to read and parse inside shell scripts. My AIX box has a very limited Perl installation and I can't install any utilities either. Specifically, wget, cURL, Lynx, w3m and LWP can't be used, as I only found out about these utilities when I googled... (0 Replies)
Discussion started by: scott_cog

8. Shell Programming and Scripting

Read URL data from UNIX-CLI without Wget,CURL,w3m,LWP

Hi Experts, Problem statement: we have a URL whose data we need to read and parse inside shell scripts. My AIX box has a very limited Perl installation and I can't install any utilities either. Specifically, wget, cURL, Lynx, w3m and LWP can't be used, as I only found out about these utilities when I googled them... (12 Replies)
Discussion started by: scott_cog
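
For the two AIX threads above that cannot use wget, curl, Lynx, w3m or LWP, a hedged sketch: if bash (not just plain ksh) happens to be installed, its built-in /dev/tcp redirection can fetch a page without any external HTTP client. The host, port and request path are placeholders, and the raw response still includes the HTTP headers.
Code:
#!/usr/bin/env bash
HOST=www.example.com      # placeholder host
PORT=80
DOCPATH=/                 # placeholder request path

exec 3<>"/dev/tcp/$HOST/$PORT"                                        # open a TCP connection
printf 'GET %s HTTP/1.0\r\nHost: %s\r\n\r\n' "$DOCPATH" "$HOST" >&3   # send the request
cat <&3                                                               # print headers and body
exec 3<&-                                                             # close the connection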

9. Shell Programming and Scripting

How to check if the URL exists?

Hi, I need to check if a URL exists. Below is my OS: SunOS mymac1 Generic_148888-04 sun4v sparc SUNW,SPARC-Enterprise-T5220. I do not have curl set in the profile, nor do I know its path, but I do have wget. Please help me with the parameters for this. Can you help me check if... (6 Replies)
Discussion started by: mohtashims

10. Shell Programming and Scripting

Wget fails for a valid URL

Wget error codes: 0: no problems occurred; 1: generic error code; 2: parse error (for instance, when parsing command-line options, the .wgetrc or .netrc...); 3: file I/O error; 4: network failure; 5: SSL verification failure; 6: username/password authentication failure. ... (3 Replies)
Discussion started by: mohtashims
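
The exit codes listed in the last discussion suggest another angle on the original question: newer wget releases (1.12 and later) no longer return a blanket 1 on failure, so a refused connection (exit code 4, network failure) can be told apart from a 404 answer (exit code 8, server issued an error response). A hedged sketch, with the URL as a placeholder:
Code:
wget --spider --tries=1 "http://172.18.242.32:7002/" > /dev/null 2>&1
rc=$?
case $rc in
    0) echo "reachable and the resource exists" ;;
    4) echo "network failure - nothing listening on that port?" ;;
    8) echo "server answered with an error (e.g. 404), so the port is up" ;;
    *) echo "other wget error: $rc" ;;
esac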