Shell Programming and Scripting: Monitoring an HTML web page for changes
Post 302737845 by prvnrk, Thursday 29 November 2012, 09:42:56 PM
wget is NOT working at all for this: the downloaded HTML file's size sometimes differs by a few bytes even though nothing on the web page has changed.

It's odd, and I don't think we can rely on wget alone for this.

Any suggestions would be highly appreciated.

Thanks!
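
A sketch of one way around the size problem, assuming a GNU userland (wget and md5sum) and a placeholder URL: strip the parts of the page that legitimately change on every fetch, then compare a checksum of what remains instead of the raw file size. The sed patterns below are only examples; adapt them to whatever actually varies on your page (timestamps, session IDs, hit counters).

Code:
#!/bin/sh
# Sketch: detect real page changes by hashing normalised content.
# URL, STATE and the sed patterns are placeholders.
URL="http://www.example.com/page.html"
STATE="/tmp/page.md5"

NEW_SUM=$(wget -q -O - "$URL" \
          | sed -e 's/<!--.*-->//g' -e '/Last-Modified/d' \
          | md5sum | awk '{print $1}')

OLD_SUM=$(cat "$STATE" 2>/dev/null)

if [ "$NEW_SUM" != "$OLD_SUM" ]; then
    echo "page content changed"
    echo "$NEW_SUM" > "$STATE"
fi

Run from cron, this only alerts when the normalised content actually changes, so a few spurious bytes in comments or embedded timestamps no longer trigger it.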
 

10 More Discussions You Might Find Interesting

1. UNIX and Linux Applications

Html web page to Unix Connectivity

Hi All, I need a basic overview of connecting an HTML web page to Unix. I will give a brief of my exact requirement: there will be a front-end HTML page, a web page which will have certain buttons, and each button will have certain functionality. For eg: there is a button for Disk Usage. When the... (1 Reply)
Discussion started by: abhilashnair
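
A minimal sketch of the usual approach to the thread above: each button submits to a CGI script under Apache's cgi-bin, and the script runs the command and prints HTML back. The script name and paths here are hypothetical.

Code:
#!/bin/sh
# Hypothetical CGI script, e.g. /var/www/cgi-bin/diskusage.sh (must be executable).
# The front-end page would point a form at it:
#   <form action="/cgi-bin/diskusage.sh"><input type="submit" value="Disk Usage"></form>
echo "Content-type: text/html"
echo ""
echo "<html><body><h2>Disk Usage</h2><pre>"
df -h                      # the command this particular button runs
echo "</pre></body></html>"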

2. Solaris

Accessing a HTML page

Hi All, on our Unix server we have an Apache web server running. I can access the default Apache web page from my Windows machine. Now I want to create my own web page, so I created it at /export/home/myname/test.html. Where do I need to place this file and what do I need... (0 Replies)
Discussion started by: pkm_oec

3. UNIX for Dummies Questions & Answers

Accessing a HTML page

Hi All, on our Unix server we have an Apache web server running. I can access the default Apache web page from my Windows machine. Now I want to create my own web page, so I created it at /export/home/myname/test.html. Where do I need to place this file and what do I need... (2 Replies)
Discussion started by: pkm_oec
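
For the two identical questions above, the page has to live under (or be reachable from) Apache's DocumentRoot. A sketch, assuming typical default paths; both the config file location and the DocumentRoot vary by platform and build, so check your own httpd.conf.

Code:
# Find out where this Apache instance serves files from
# (config path is an example; it may be /etc/httpd/... or /usr/apache2/... on your system)
grep -i '^DocumentRoot' /etc/apache2/httpd.conf

# Copy the page there -- /var/www/html is only an example DocumentRoot
cp /export/home/myname/test.html /var/www/html/

# Then browse to http://<server-hostname>/test.html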

4. Web Development

findstr in html page

I am planning to create an HTML page that will count the number of connected ports; the challenge for me is how to put that in a page. Thanks! (1 Reply)
Discussion started by: webmunkey23
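
One way to approach the thread above, sketched with placeholder paths: count established TCP connections with netstat and write the figure into a small static HTML page that the web server can serve.

Code:
#!/bin/sh
# Count established TCP connections and publish the figure as a static HTML page.
# OUT is a placeholder; netstat output can differ slightly between platforms.
OUT="/var/www/html/ports.html"
COUNT=$(netstat -an | grep -c 'ESTABLISHED')

cat > "$OUT" <<EOF
<html><body>
<h2>Connected ports</h2>
<p>Established TCP connections: $COUNT</p>
<p>Generated: $(date)</p>
</body></html>
EOF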

5. Web Development

Call shell script from HTML page - without web server

Hi, I have an HTML page on my Unix machine (server), which I will open with Firefox or Mozilla available on the Unix machine; Firefox or Mozilla will be opened using X Windows. Since I have access to the Unix machine (like other users) and this HTML page is for users having access to the Unix machine, I see no... (7 Replies)
Discussion started by: vamanu9

6. UNIX for Dummies Questions & Answers

Publishing HTML Page

Hi All, thanks for reading. I am not sure if I am asking this in the correct group, but here it goes: there is a shell script which does some system checks and creates an HTML file called system_summary.html on my Red Hat machine, say in the /reports directory, every hour. Now I want to view it... (1 Reply)
Discussion started by: deepakgang

7. Red Hat

Publishing HTML Page

Hi All, thanks for reading. I am not sure if I am asking this in the correct group, but here it goes: there is a shell script which does some system checks and creates an HTML file called system_summary.html on my Red Hat machine, say in the /reports directory, every hour. Now I want to view it... (6 Replies)
Discussion started by: deepakgang
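
For the two "Publishing HTML Page" threads above, a simple sketch: since the report is regenerated hourly in /reports, either link it into the web server's DocumentRoot once, or copy it on a schedule. The DocumentRoot path below is an example.

Code:
# One-time: expose the report under Apache's DocumentRoot (example path)
ln -s /reports/system_summary.html /var/www/html/system_summary.html

# If the server is not configured to follow symlinks, copy it hourly instead.
# Example crontab entry (five minutes past each hour):
#   5 * * * * cp /reports/system_summary.html /var/www/html/system_summary.html

The page is then reachable as http://<server-hostname>/system_summary.html.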

8. Shell Programming and Scripting

Attached HTML page to Email

Greetings all, has anyone tried sending out email with an HTML page as an attachment from a shell script? I know that if I use the uuencode file.html approach, the attachment received in the mail is empty, so I guess uuencode cannot be used for the HTML code. I'd appreciate it if anyone can share the code to... (0 Replies)
Discussion started by: ckwan
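
A sketch of one way to do what the thread above asks without uuencode: build a minimal MIME message by hand and pipe it to sendmail. The addresses, file name, and sendmail path are placeholders.

Code:
#!/bin/sh
# Send report.html as a proper MIME attachment via sendmail.
TO="user@example.com"          # placeholder recipient
FROM="monitor@example.com"     # placeholder sender
FILE="report.html"             # the HTML file to attach
BOUNDARY="==MAIL_BOUNDARY_$$=="

{
  echo "To: $TO"
  echo "From: $FROM"
  echo "Subject: HTML report"
  echo "MIME-Version: 1.0"
  echo "Content-Type: multipart/mixed; boundary=\"$BOUNDARY\""
  echo ""
  echo "--$BOUNDARY"
  echo "Content-Type: text/plain"
  echo ""
  echo "Report attached."
  echo ""
  echo "--$BOUNDARY"
  echo "Content-Type: text/html; name=\"$FILE\""
  echo "Content-Disposition: attachment; filename=\"$FILE\""
  echo ""
  cat "$FILE"
  echo ""
  echo "--$BOUNDARY--"
} | /usr/sbin/sendmail -t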

9. AIX

How to Use a UNIX Shell Script to Create an HTML Web Page?

Dear friends, in my work I have to monitor some system performance on an hourly basis by running some commands, for example lpstat, to know that all the queues are ready. How can I create a web page, connect it with the server (AIX operating system), and make this page refresh every 10 seconds and... (12 Replies)
Discussion started by: rami abusweilei
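
A sketch for the AIX thread above, with placeholder paths: run the checks, wrap their output in an HTML page carrying a meta-refresh tag so the browser reloads it every 10 seconds, and write the page into the web server's document root. Run the script itself from cron or a loop so the content stays current.

Code:
#!/bin/sh
# Regenerate a status page that browsers reload every 10 seconds.
# OUT is a placeholder; point it at your web server's document root.
OUT="/var/www/html/queues.html"

cat > "$OUT" <<EOF
<html>
<head><meta http-equiv="refresh" content="10"></head>
<body>
<h2>Print queue status - $(date)</h2>
<pre>
$(lpstat 2>&1)
</pre>
</body>
</html>
EOF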

10. Shell Programming and Scripting

Accessing the html page

Hi All, on our Unix server we have an Apache web server running. Now I want to create my own web page, so I created it at /export/home/test.html. Where do I need to place this file, and what do I need to enter in my web browser to access this page? Without Apache... (1 Reply)
Discussion started by: Arasu
httpindex(1)                                    General Commands Manual                                    httpindex(1)

NAME
       httpindex - HTTP front-end for SWISH++ indexer

SYNOPSIS
       wget [ options ] URL... 2>&1 | httpindex [ options ]

DESCRIPTION
       httpindex is a front-end for index++(1) to index files copied from remote servers using wget(1).  The files (in
       a copy of the remote directory structure) can be kept, deleted, or replaced with their descriptions after
       indexing.

OPTIONS
   wget Options
       The wget(1) options that are required are: -A, -nv, -r, and -x; the ones that are highly recommended are: -l,
       -nh, -t, and -w.  (See the EXAMPLE.)

   httpindex Options
       httpindex accepts the same short options as index++(1) except for -H, -I, -l, -r, -S, and -V.  The following
       options are unique to httpindex:

       -d     Replace the text of local copies of retrieved files with their descriptions after they have been
              indexed.  This is useful to display file descriptions in search results without having to keep complete
              copies of the remote files, thus saving filesystem space.  (See the extract_description() function in
              WWW(3) for details about how descriptions are extracted.)

       -D     Delete the local copies of retrieved files after they have been indexed.  This prevents your local
              filesystem from filling up with copies of remote files.

EXAMPLE
       To index all HTML and text files on a remote web server, keeping descriptions locally:

              wget -A html,txt -linf -t2 -rxnv -nh -w2 http://www.foo.com 2>&1 | httpindex -d -e'html:*.html,text:*.txt'

       Note that you need to redirect wget(1)'s output from standard error to standard output in order to pipe it to
       httpindex.

EXIT STATUS
       Exits with a value of zero only if indexing completed successfully; non-zero otherwise.

CAVEATS
       In addition to those for index++(1), httpindex does not correctly handle the use of multiple -e, -E, -m, or -M
       options (because the Perl script uses the standard Getopt::Std package for processing command-line options,
       which doesn't).  The last of any of those options "wins."  The work-around is to give multiple values,
       separated by commas, to a single one of those options.  For example, if you want to do:

              httpindex -e'html:*.html' -e'text:*.txt'

       do this instead:

              httpindex -e'html:*.html,text:*.txt'

SEE ALSO
       index++(1), wget(1), WWW(3)

AUTHOR
       Paul J. Lucas <pauljlucas@mac.com>

SWISH++                                             August 2, 2005                                         httpindex(1)