12-02-2010
Checking Web Pages?
Hey guys,
Unfortunately, I cannot use wget on our systems....
I am looking for another way for a UNIX script to test web pages and let me know whether they are up or down for some of our applications.
Has anyone seen this before?
Thanks,
Ryan
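Since wget is off the table, one common fallback is curl, which is present on many UNIX systems. A minimal sketch, assuming curl is installed (the URL below is a placeholder):

```shell
#!/bin/sh
# Check whether a page answers with a "good" HTTP status.
# Assumes curl is available; the URL is a placeholder.
url="http://example.com/"

# -s: quiet, -o /dev/null: discard the body, -w: print only the status code
status=$(curl -s -o /dev/null --max-time 10 -w '%{http_code}' "$url")

if [ "$status" -ge 200 ] && [ "$status" -lt 400 ]; then
    echo "$url is UP (HTTP $status)"
else
    echo "$url is DOWN (HTTP $status)"
fi
```

If curl is also unavailable, bash's built-in /dev/tcp redirection or a small Perl one-liner can open the TCP connection instead.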
9 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
Hi,
my company is considering a new development of our web site, which used to run on Apache on Solaris.
The company that is going to do this for us knows only about developing it in ASP.
I guess this means we'll have to have another IIS server on NT for these dynamic pages :(
What are... (5 Replies)
Discussion started by: me2unix
2. Shell Programming and Scripting
Count the number of hyperlinks in all web pages in the current directory and all of its sub-directories. Count in all files of type "*htm" and "*html".
i want the output to look something like this:
Total number of web pages: (number)
Total number of links: (number)
Average number of links... (1 Reply)
Discussion started by: phillip
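A sketch of this counting task in plain shell, assuming find and grep are available and treating each href attribute of an anchor tag as one link:

```shell
#!/bin/sh
# Count *.htm/*.html pages under the current directory (recursively)
# and the <a ... href> occurrences inside them.
pages=$(find . -type f \( -name '*.htm' -o -name '*.html' \) | wc -l)
links=$(find . -type f \( -name '*.htm' -o -name '*.html' \) \
            -exec grep -o -i '<a [^>]*href' {} + | wc -l)

echo "Total number of web pages: $pages"
echo "Total number of links: $links"
if [ "$pages" -gt 0 ]; then
    echo "Average number of links per page: $((links / pages))"
fi
```

grep -o prints one line per match, so piping through wc -l counts every link, including several on the same source line.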
3. UNIX for Dummies Questions & Answers
Hi All!
Is this possible?
I know of several hundreds of urls linking to similar looking hp-ux man pages, like these. In these URLs only the last path components, separated by /, change in numbering, so we can generate these...
http://docs.hp.com/hpux/onlinedocs/B3921-90010/00/00/31-con.html... (2 Replies)
Discussion started by: Vishnu
4. Shell Programming and Scripting
Hi Guru's,
I need to check the availability of a web page for every one hour through a script. Following is the requirement.
1. Go to http://vsn.srivsn.com
2. If any error is encountered in opening the home page, it should trigger an email with the appropriate error message.
3. If page opens... (6 Replies)
Discussion started by: srivsn
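A hedged sketch of such a check, assuming curl and mailx are installed (the recipient address is a placeholder); running it every hour is usually left to cron rather than a sleep loop:

```shell
#!/bin/sh
# Page availability check; run hourly from cron, e.g.:
#   0 * * * * /usr/local/bin/check_page.sh
# Assumes curl and mailx exist; the recipient is a placeholder.
URL="http://vsn.srivsn.com"
RECIPIENT="ops@example.com"

# -f turns HTTP errors into a curl failure; stderr holds the error message
if ! err=$(curl -sS -f -o /dev/null --max-time 30 "$URL" 2>&1); then
    echo "Failed to open $URL: $err" |
        mailx -s "Page check failed: $URL" "$RECIPIENT"
fi
```

The 2>&1 inside the command substitution captures curl's error text so it can be forwarded as the mail body.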
5. Shell Programming and Scripting
Hello. I want to make an awk script to search an HTML file and output all the links (e.g. .html, .htm, .jpg, .doc, .pdf, etc.) inside it. Also, I want the output links to be split into 3 groups (separated by an empty line), the first group with links to other webpages (.html, .htm etc.),... (1 Reply)
Discussion started by: adpe
6. UNIX for Dummies Questions & Answers
I can't quite seem to understand what the curl command does with a web address. I tried this:
curl O'Reilly Media: Tech Books, Conferences, Courses, News
but I just got the first few lines of a web page, and it's nowhere on my machine. Can someone elaborate? (2 Replies)
Discussion started by: Straitsfan
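For what it's worth, curl writes the response body to standard output by default and saves nothing unless asked; a quick illustration (the URL shown is a placeholder):

```shell
# By default curl prints the fetched page to stdout:
curl -s http://example.com/

# -o writes it to a file you name; -O (capital o) reuses the remote file name:
curl -s -o page.html http://example.com/
```

That is why the poster saw the page's first lines in the terminal but found no file on disk.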
7. UNIX for Dummies Questions & Answers
Here is an observation that has started to riddle me and perhaps someone can enlighten me. When a web page (or desktop page for that matter) uses the standard font, it is not anti-aliased, unless the user opts in to do so via the desktop settings.
It appears however that fonts are not... (0 Replies)
Discussion started by: figaro
8. Shell Programming and Scripting
Hello,
I'm writing a shell script to wget the content of web pages from multiple servers into a variable and compare them;
if they match, return 0, otherwise return 2
#!/bin/bash
# Cluster 1
CLUSTER1_SERVERS="srv1 srv2 srv3 srv4"
CLUSTER1_APPLIS="test/version.html test2.version.jsp"
# List of... (4 Replies)
Discussion started by: gtam
9. Shell Programming and Scripting
Hello
I'm writing a script to get the content of web pages on different machines and compare them using their md5 hashes.
Here is my code:
#!/bin/bash
# Cluster 1
CLUSTER1_SERVERS="srv01:7051 srv02:7052 srv03:7053 srv04:7054"
CLUSTER1_APPLIS="test/version.html test2/version.html... (2 Replies)
Discussion started by: gtam
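One way to sketch the comparison, assuming curl and md5sum are available (the server list and page path are placeholders modelled on the post):

```shell
#!/bin/sh
# Compare the md5 of one page as served by each host in a cluster.
# Assumes curl and md5sum; hosts and path are placeholders.
SERVERS="srv01:7051 srv02:7052 srv03:7053"
PAGE="test/version.html"

ref=""
for s in $SERVERS; do
    sum=$(curl -s "http://$s/$PAGE" | md5sum | awk '{print $1}')
    if [ -z "$ref" ]; then
        ref=$sum                      # first server sets the reference
    elif [ "$sum" != "$ref" ]; then
        echo "Mismatch on $s"
        exit 2
    fi
done
echo "All servers match"
```

Hashing the body avoids holding whole pages in variables; only the 32-character digests are compared.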
LEARN ABOUT DEBIAN
gedcom::webservices
Gedcom::WebServices(3pm) User Contributed Perl Documentation Gedcom::WebServices(3pm)
NAME
Gedcom::WebServices - Basic web service routines for Gedcom.pm
Version 1.16 - 24th April 2009
SYNOPSIS
wget -qO - http://www.example.com/ws/plain/my_family/i9/name
DESCRIPTION
This module provides web service access to a GEDCOM file in conjunction with mod_perl. Using it, a request for information can be made in
the form of a URL specifying the GEDCOM file to be used, which information is required, and the format in which the information is to be
delivered. The information is then returned in the specified format.
There are currently three supported formats:
o plain - no markup
o XML
o JSON
URLs
The format of the URLs used to access the web services is:
$BASEURL/$FORMAT/$GEDCOM/$XREF/requested/information
$BASEURL/$FORMAT/$GEDCOM?search=search_criteria
BASEURL
The base URL to access the web services.
FORMAT
The format in which to return the results.
GEDCOM
The name of the GEDCOM file to use (the extension .ged is assumed).
XREF
The xref of the record about which information is required. XREFs can be obtained initially from a search, and subsequently from
certain queries.
requested/information
The information requested. This is in the same format as that taken by the get_value method.
search_criteria
An individual to search for. This is in the same format as that taken by the get_individual method.
EXAMPLES
$ wget -qO - 'http://pjcj.sytes.net:8585/ws/plain/royal92?search=elizabeth_ii'
/ws/plain/royal92/I52
$ wget -qO - http://pjcj.sytes.net:8585/ws/plain/royal92/I52
0 @I52@ INDI
1 NAME Elizabeth_II Alexandra Mary/Windsor/
1 TITL Queen of England
1 SEX F
1 BIRT
2 DATE 21 APR 1926
2 PLAC 17 Bruton St.,London,W1,England
1 FAMS @F14@
1 FAMC @F12@
$ wget -qO - http://pjcj.sytes.net:8585/ws/plain/royal92/I52/name
Elizabeth_II Alexandra Mary /Windsor/
$ wget -qO - http://pjcj.sytes.net:8585/ws/plain/royal92/I52/birth/date
21 APR 1926
$ wget -qO - http://pjcj.sytes.net:8585/ws/plain/royal92/I52/children
/ws/plain/royal92/I58
/ws/plain/royal92/I59
/ws/plain/royal92/I60
/ws/plain/royal92/I61
$ wget -qO - http://pjcj.sytes.net:8585/ws/json/royal92/I52/name
{"name":"Elizabeth_II Alexandra Mary /Windsor/"}
$ wget -qO - http://pjcj.sytes.net:8585/ws/xml/royal92/I52/name
<NAME>Elizabeth_II Alexandra Mary /Windsor/</NAME>
$ wget -qO - http://pjcj.sytes.net:8585/ws/xml/royal92/I52
<INDI ID="I52">
<NAME>Elizabeth_II Alexandra Mary/Windsor/</NAME>
<TITL>Queen of England</TITL>
<SEX>F</SEX>
<BIRT>
<DATE>21 APR 1926</DATE>
<PLAC>17 Bruton St.,London,W1,England</PLAC>
</BIRT>
<FAMS REF="F14"/>
<FAMC REF="F12"/>
</INDI>
CONFIGURATION
Add a section similar to the following to your mod_perl config:
PerlWarn On
PerlTaintCheck On
PerlPassEnv GEDCOM_TEST
<IfDefine GEDCOM_TEST>
<Perl>
$Gedcom::TEST = 1;
</Perl>
</IfDefine>
<Perl>
use Apache::Status;
$ENV{PATH} = "/bin:/usr/bin";
delete @ENV{"IFS", "CDPATH", "ENV", "BASH_ENV"};
$Gedcom::DATA = $Gedcom::ROOT; # location of data stored on server
use lib "$Gedcom::ROOT/blib/lib";
use Gedcom::WebServices;
my $handlers =
[ qw
(
plain
xml
json
)
];
eval Gedcom::WebServices::_set_handlers($handlers);
# use Apache::PerlSections; print STDERR Apache::PerlSections->dump;
</Perl>
PerlTransHandler Gedcom::WebServices::_parse_uri
BUGS
Very probably.
See the BUGS file. And the TODO file.
VERSION
Version 1.16 - 24th April 2009
LICENCE
Copyright 2005-2009, Paul Johnson (paul@pjcj.net)
This software is free. It is licensed under the same terms as Perl itself.
The latest version of this software should be available from my homepage: http://www.pjcj.net
perl v5.14.2 2012-04-12 Gedcom::WebServices(3pm)