Operating Systems > AIX > How to Use a UNIX Shell Script to Create an HTML Web Page?
Post 302747851 by Yoda on Sunday, 23rd of December 2012, 10:59:23 AM
If you are not familiar with CGI, then I suggest running a shell script that executes lpstat every 10 seconds.

You can code an infinite while loop that executes lpstat and then sleeps for 10 seconds. Redirect the output to an HTML file and copy it into your local IIS web server's document root, or FTP it there if the IIS web server is running on a remote machine:
Code:
while true
do
    # Run lpstat and redirect its output to an HTML file.
    lpstat > lpstat.html
    # Copy (or FTP) this HTML file to the IIS web server to publish it.
    sleep 10    # Sleep for 10 seconds.
done
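If you also want the browser to reload the page on its own, the loop body can wrap the lpstat output in a bare HTML skeleton carrying a meta-refresh tag, so no CGI is needed at all. A minimal sketch, assuming a helper called make_status_page, an output file named lpstat.html, and the `lpstat -t` flag (none of these are from the post above):

```shell
#!/bin/sh
# Sketch only: generate a self-refreshing HTML page from a command's output.

make_status_page() {
    cmd=$1      # command whose output we publish
    out=$2      # HTML file to (re)write
    {
        echo '<html><head>'
        echo '<meta http-equiv="refresh" content="10">'   # browser re-reads every 10s
        echo '<title>Printer Status</title></head><body><pre>'
        $cmd 2>&1                                         # raw command output
        echo '</pre></body></html>'
    } > "$out"
}

# Regenerate the page every 10 seconds; after each pass, copy or FTP
# lpstat.html into the IIS document root to publish it:
#   while true; do make_status_page "lpstat -t" lpstat.html; sleep 10; done
```

If the IIS virtual directory can be pointed at wherever lpstat.html lands, the copy/FTP step disappears entirely.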

 

LWP-RGET(1)                 User Contributed Perl Documentation                 LWP-RGET(1)

NAME
       lwp-rget - Retrieve WWW documents recursively

SYNOPSIS
       lwp-rget [--verbose] [--auth=USER:PASS] [--depth=N] [--hier] [--iis]
                [--keepext=mime/type[,mime/type]] [--limit=N] [--nospace]
                [--prefix=URL] [--referer=URL] [--sleep=N] [--tolower] <URL>

       lwp-rget --version

DESCRIPTION
       This program will retrieve a document and store it in a local file. It will follow
       any links found in the document and store these documents as well, patching links
       so that they refer to these local copies. This process continues until there are
       no more unvisited links or the process is stopped by one or more of the limits,
       which can be controlled by the command line arguments.

       This program is useful if you want to make a local copy of a collection of
       documents or want to do web reading off-line.

       All documents are stored as plain files in the current directory. The file names
       chosen are derived from the last component of URL paths.

       The options are:

       --auth=USER:PASS
           Set the authentication credentials to user "USER" and password "PASS" if any
           restricted parts of the web site are hit. If there are restricted parts of
           the web site and authentication credentials are not available, those pages
           will not be downloaded.

       --depth=n
           Limit the recursion level. Embedded images are always loaded, even if they
           fall outside the --depth. This means that one can use --depth=0 in order to
           fetch a single document together with all inline graphics. The default depth
           is 5.

       --hier
           Download files into a hierarchy that mimics the web site structure. The
           default is to put all files in the current directory.

       --referer=URI
           Set the value of the Referer header for the initial request. The special
           value "NONE" can be used to suppress the Referer header in any subsequent
           requests.

       --iis
           Sends an "Accept: */*" header on all URL requests as a workaround for a bug
           in IIS 2.0. If no Accept MIME header is present, IIS 2.0 returns a "406 No
           acceptable objects were found" error. Also converts any back slashes (\) in
           URLs to forward slashes (/).

       --keepext=mime/type[,mime/type]
           Keeps the current extension for the listed MIME types. Useful when
           downloading text/plain documents that shouldn't all be translated to *.txt
           files.

       --limit=n
           Limit the number of documents to get. The default limit is 50.

       --nospace
           Changes spaces in all URLs to underscore characters (_). Useful when
           downloading files from sites serving URLs with spaces in them. Does not
           remove spaces from fragments, e.g., "file.html#somewhere in here".

       --prefix=url_prefix
           Limit the links to follow. Only URLs that start with the prefix string are
           followed. The default prefix is set to the "directory" of the initial URL.
           For instance, if we start lwp-rget with the URL
           "http://www.sn.no/foo/bar.html", then the prefix will be set to
           "http://www.sn.no/foo/". Use "--prefix=''" if you don't want the fetching to
           be limited by any prefix.

       --sleep=n
           Sleep n seconds before retrieving each document. This option allows you to
           go slowly, so as not to load the server you are visiting too much.

       --tolower
           Translates all links to lowercase. Useful when downloading files from IIS,
           since it does not serve files in a case-sensitive manner.

       --verbose
           Make more noise while running.

       --quiet
           Don't make any noise.

       --version
           Print program version number and quit.

       --help
           Print the usage message and quit.

       Before the program exits, the name of the file where the initial URL is stored
       is printed on stdout. All used filenames are also printed on stderr as they are
       loaded. This printing can be suppressed with the --quiet option.

SEE ALSO
       lwp-request, LWP

AUTHOR
       Gisle Aas <aas@sn.no>

libwww-perl-5.65                      2002-01-02                              LWP-RGET(1)
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.