11-06-2002
Selecting information from several web pages...
Hi All!
Is this possible?
I know of several hundred URLs linking to similar-looking HP-UX man pages, like the ones below. In these URLs only the last path components (separated by /) change in their numbering, so we can generate them...
http://docs.hp.com/hpux/onlinedocs/B...00/31-con.html
http://docs.hp.com/hpux/onlinedocs/B...00/34-con.html
http://docs.hp.com/hpux/onlinedocs/B...3/331-con.html
I know that all these pages follow a certain pattern in their layout. I want to make a small consolidated report of all the HP-UX commands listed in these pages, with only their, say, descriptions, examples, etc...
If I have a command that I can run in a loop over such URLs, returning the page contents on each pass, I can filter out the sections I want...
Is this possible? Any hint is highly appreciated...
Also, is there a UNIX utility that converts HTML to simple readable text?
Cheers!
Vishnu.
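A minimal sketch of such a loop, assuming wget or lynx is available (lynx -dump is one common way to render HTML as plain readable text; w3m -dump and html2text are alternatives). The book directory and page numbers below are placeholders, since the real paths are truncated above, and the DESCRIPTION/EXAMPLES section names are assumptions about the page layout:

#!/bin/sh
# Hypothetical sketch: loop over generated man-page URLs, render each
# page as plain text, and keep only the part between the DESCRIPTION
# and EXAMPLES headings.
for n in 31 34 331               # placeholder numbers; generate the real list
do
    url="http://docs.hp.com/hpux/onlinedocs/SOME-BOOK/${n}-con.html"
    lynx -dump "$url" |
        sed -n '/^DESCRIPTION/,/^EXAMPLES/p' >> report.txt
done

Each pass appends one filtered page to report.txt; swapping the sed range changes which sections survive.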
9 More Discussions You Might Find Interesting
1. UNIX for Dummies Questions & Answers
Hi,
my company is considering a redevelopment of our web site, which has been running on Apache over Solaris.
The company that is going to do this for us only knows how to develop it in ASP.
I guess this means we'll have to have another IIS server on NT for these dynamic pages :(
What are... (5 Replies)
Discussion started by: me2unix
2. Shell Programming and Scripting
Count the number of hyperlinks in all web pages in the current directory and all of its subdirectories, covering all files of type "*.htm" and "*.html".
I want the output to look something like this:
Total number of web pages: (number)
Total number of links: (number)
Average number of links... (1 Reply)
Discussion started by: phillip
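A rough sketch of that counting task, assuming GNU grep (for -o and -i) and treating every <a ... href occurrence as one link; the average uses integer division:

#!/bin/sh
# Hypothetical sketch: count pages and <a href> links under the
# current directory tree.
pages=$(find . -type f \( -name '*.htm' -o -name '*.html' \) | wc -l)
links=$(find . -type f \( -name '*.htm' -o -name '*.html' \) \
        -exec grep -oi '<a[^>]*href' {} + | wc -l)
echo "Total number of web pages: $pages"
echo "Total number of links: $links"
[ "$pages" -gt 0 ] && echo "Average number of links per page: $((links / pages))"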
3. UNIX for Dummies Questions & Answers
Is there any way to browse web pages while on the command line?
I know wget can download pages, but I was wondering if there was an option other than that. (2 Replies)
Discussion started by: vroomlicious
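Text-mode browsers cover this; lynx, links, and w3m are the usual choices, assuming one of them is installed:

lynx http://example.com/            # interactive text-mode browsing
lynx -dump http://example.com/      # or just render the page to stdout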
4. Shell Programming and Scripting
Hello. I want to make an awk script to search an HTML file and output all the links (e.g. .html, .htm, .jpg, .doc, .pdf, etc.) inside it. Also, I want the links in the output to be split into 3 groups (separated by an empty line), the first group with links to other web pages (.html, .htm, etc.),... (1 Reply)
Discussion started by: adpe
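A starting-point sketch for that grouping, assuming GNU grep and quoted href/src attributes; page.html and the extension lists are hypothetical, and real HTML would need sturdier parsing:

# Pull quoted href/src targets out of page.html, then bucket by extension.
grep -oi '\(href\|src\)="[^"]*"' page.html |
    sed 's/.*="//; s/"$//' |
    awk '
        /\.(html?|php)$/     { pages[++p] = $0; next }  # other web pages
        /\.(jpe?g|gif|png)$/ { imgs[++i]  = $0; next }  # images
                             { other[++o] = $0 }        # everything else
        END {
            for (n = 1; n <= p; n++) print pages[n]; print ""
            for (n = 1; n <= i; n++) print imgs[n];  print ""
            for (n = 1; n <= o; n++) print other[n]
        }'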
5. UNIX for Dummies Questions & Answers
I can't quite seem to understand what the curl command does with a web address. I tried this:
curl <O'Reilly home-page URL>
but I just got the first few lines of a web page, and it's nowhere on my machine. Can someone elaborate? (2 Replies)
Discussion started by: Straitsfan
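What happened there: by default curl writes the fetched page to standard output and saves nothing to disk. A sketch of the usual variants (example.com is just a stand-in URL):

curl http://www.example.com/                # page body goes to stdout
curl http://www.example.com/ > page.html    # redirect stdout into a file
curl -o page.html http://www.example.com/   # -o names the output file
curl -O http://www.example.com/index.html   # -O saves under the remote name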
6. UNIX for Dummies Questions & Answers
Here is an observation that has started to riddle me and perhaps someone can enlighten me. When a web page (or desktop page for that matter) uses the standard font, it is not anti-aliased, unless the user opts in to do so via the desktop settings.
It appears, however, that fonts are not... (0 Replies)
Discussion started by: figaro
7. Shell Programming and Scripting
Hey guys,
Unfortunately, I cannot use wget on our systems...
I am looking for another way for a UNIX script to test web pages and let me know if they are up or down for some of our applications.
Has anyone seen this before?
Thanks,
Ryan (2 Replies)
Discussion started by: rwcolb90
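If curl is allowed (only wget was ruled out), a sketch of a status check; the URL list is hypothetical:

#!/bin/sh
# Hypothetical sketch: report each URL as UP or DOWN from the HTTP
# status code curl receives (000 means no connection at all).
for url in http://app1.example.com/ http://app2.example.com/
do
    code=$(curl -s -o /dev/null -w '%{http_code}' "$url")
    case "$code" in
        2*|3*) echo "UP   $url ($code)" ;;
        *)     echo "DOWN $url ($code)" ;;
    esac
done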
8. Shell Programming and Scripting
Hello,
I'm writing a shell script to wget the content of web pages from multiple servers into a variable and compare them;
if they match, return 0, otherwise return 2.
#!/bin/bash
# Cluster 1
CLUSTER1_SERVERS="srv1 srv2 srv3 srv4"
CLUSTER1_APPLIS="test/version.html test2.version.jsp"
# List of... (4 Replies)
Discussion started by: gtam
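One hedged way to finish that comparison (the server and page names come from the snippet above; the rest is an assumption about the intended logic):

#!/bin/sh
# Hypothetical sketch: fetch the same page from every server in the
# cluster and exit 2 as soon as any copy differs from the first one.
CLUSTER1_SERVERS="srv1 srv2 srv3 srv4"
PAGE="test/version.html"
first=1
for srv in $CLUSTER1_SERVERS
do
    content=$(wget -q -O - "http://$srv/$PAGE")
    if [ "$first" -eq 1 ]; then
        reference=$content; first=0     # first server sets the baseline
    elif [ "$content" != "$reference" ]; then
        exit 2                          # mismatch between servers
    fi
done
exit 0                                  # all copies matched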
9. Shell Programming and Scripting
Hello
I'm writing a script to get the content of web pages on different machines and compare them using their md5 hashes;
here is my code
#!/bin/bash
# Cluster 1
CLUSTER1_SERVERS="srv01:7051 srv02:7052 srv03:7053 srv04:7054"
CLUSTER1_APPLIS="test/version.html test2/version.html... (2 Replies)
Discussion started by: gtam
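A sketch of just the hashing step, assuming md5sum is on the path (md5 -q is the BSD equivalent); the host:port entries come from the snippet above:

#!/bin/sh
# Hypothetical sketch: print one md5 hash per server for the same page,
# so any differing line points straight at the out-of-sync machine.
CLUSTER1_SERVERS="srv01:7051 srv02:7052 srv03:7053 srv04:7054"
PAGE="test/version.html"
for srv in $CLUSTER1_SERVERS
do
    hash=$(wget -q -O - "http://$srv/$PAGE" | md5sum | awk '{print $1}')
    echo "$srv $hash"
done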
ftp_geturl
ftp::geturl(n) ftp client ftp::geturl(n)
NAME
ftp::geturl - Uri handler for ftp urls
SYNOPSIS
package require Tcl 8.2
package require ftp::geturl ?0.2.1?
::ftp::geturl url
DESCRIPTION
This package provides a command which wraps around the client side of the ftp protocol provided by package ftp to allow the retrieval of
urls using the ftp schema.
API
::ftp::geturl url
This command can be used by the generic command ::uri::geturl (See package uri) to retrieve the contents of ftp urls. Internally it
uses the commands of the package ftp to fulfill the request.
The contents of an ftp url are defined as follows:
file The contents of the specified file itself.
directory
A listing of the contents of the directory in key value notation where the file name is the key and its attributes the associated value.
link The attributes of the link, including the path it refers to.
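A minimal usage sketch in Tcl, assuming the package is installed and that the example url exists; per the description above, a file url should yield the file's contents:

package require ftp::geturl

# Retrieve a file over ftp; a directory url would instead return the
# key/value listing described above.
set contents [::ftp::geturl ftp://ftp.example.com/README]
puts $contents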
BUGS, IDEAS, FEEDBACK
This document, and the package it describes, will undoubtedly contain bugs and other problems. Please report such in the category ftp of
the Tcllib SF Trackers [http://sourceforge.net/tracker/?group_id=12883]. Please also report any ideas for enhancements you may have for
either package and/or documentation.
SEE ALSO
ftpd, mime, pop3, smtp
KEYWORDS
ftp, internet, net, rfc 959
CATEGORY
Networking
ftp 0.2.1 ftp::geturl(n)