Full Discussion: check web content - Ksh
Operating Systems > AIX
Post 302451544 by funksen on Tuesday, 7th of September 2010, 07:31:46 AM
I would use wget.

It is part of the AIX Toolbox; see the IBM AIX Toolbox download information page.
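
For example, a minimal ksh sketch (the URL and output path are placeholders, not from the original thread) that fetches a page with wget and reports the result:

Code:
#!/usr/bin/ksh
# Fetch a page with wget and report success or failure.
# URL and OUT are placeholders -- adjust for your environment.
URL="http://example.com/page.html"
OUT="/tmp/page.html"

if wget -q -O "$OUT" "$URL"; then
    print "downloaded $URL to $OUT"
else
    print -u2 "failed to fetch $URL"
    exit 1
fi

wget exits 0 only when the transfer succeeded, so the if test is all the error checking the script needs.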
 

4 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Web content download

There is a web API: https://pi.ercot.com/contentproxy/publicList?folder_id=10001937 When this is opened in a browser, a page comes up listing many files for download. I want to download all the files. Is it possible to write a shell script that can download them? (1 Reply)
Discussion started by: viv1
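
A rough ksh sketch of one way to do this, assuming the listing page is plain HTML with href="..." links (the actual API may return JSON, which would need different parsing) and that GNU grep, e.g. from the AIX Toolbox, is available for grep -o:

Code:
#!/usr/bin/ksh
# Fetch the listing page, extract the linked URLs, and download each.
LIST_URL="https://pi.ercot.com/contentproxy/publicList?folder_id=10001937"

wget -q -O /tmp/list.html "$LIST_URL" || exit 1

# grep -o is a GNU extension; stock AIX grep would need sed instead.
grep -o 'href="[^"]*"' /tmp/list.html | sed 's/^href="//; s/"$//' |
while read -r link; do
    wget -q "$link" && print "downloaded $link"
done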

2. Shell Programming and Scripting

How to check whether a web page is up without downloading any content

Hi, I am writing a script to check whether a web page is up and running. I don't want to download the page, so I used the wget --spider command, but I am not getting anything. Can someone tell me the command to check the page? If the page is up and running I should send a mail to my... (19 Replies)
Discussion started by: ahamed
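
A minimal sketch using wget --spider, which makes the request without saving the body; the URL and mail recipient are placeholders:

Code:
#!/usr/bin/ksh
# Check that a page answers without downloading its content,
# then mail a notification if it is up.
URL="http://example.com/"
TO="admin@example.com"

if wget -q --spider "$URL"; then
    print "$URL is up" | mail -s "page up: $URL" "$TO"
else
    print -u2 "$URL appears to be down"
fi

wget --spider exits 0 when the server answers the request successfully, so the exit status alone tells you whether the page is reachable.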

3. Shell Programming and Scripting

Getting web page content and delivering it via mail

Guys, could you please help? I want a shell script that gets a website's HTML content and at the same time mails the HTML page to a DL. Can anyone help here? (2 Replies)
Discussion started by: AnkitC
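
A sketch of one way to do this, piping the fetched page through sendmail with a text/html content type so the mail client renders it; the URL, recipient, and sendmail path are assumptions:

Code:
#!/usr/bin/ksh
# Fetch a page and mail it as HTML to a distribution list.
URL="http://example.com/status.html"
TO="team-dl@example.com"

wget -q -O /tmp/page.html "$URL" || exit 1

{
    print "To: $TO"
    print "Subject: page content: $URL"
    print "MIME-Version: 1.0"
    print "Content-Type: text/html"
    print ""
    cat /tmp/page.html
} | /usr/sbin/sendmail -t

sendmail -t takes the recipient from the To: header, and the Content-Type header is what makes the body render as HTML rather than as raw markup.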

4. Shell Programming and Scripting

Trying to submit web form content to a shell script

Hi, I was hoping someone could help me with a problem I have. I am trying to collect some information from a web form and save it to a text file. I found an example on this site that is sort of what I am trying to accomplish; the shell script below should echo the input back to the browser... (0 Replies)
Discussion started by: Paul Walker
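
A sketch of such a CGI script in ksh, assuming the form posts to it and the web server passes CONTENT_LENGTH in the environment; the output file path is a placeholder:

Code:
#!/usr/bin/ksh
# CGI script: save the posted form data to a text file and
# echo it back to the browser. The data is left URL-encoded
# for simplicity.
OUTFILE="/tmp/form_data.txt"

# Read exactly CONTENT_LENGTH bytes of the POST body from stdin.
data=$(dd bs=1 count="${CONTENT_LENGTH:-0}" 2>/dev/null)

print "$data" >> "$OUTFILE"

# The blank line ends the HTTP headers; everything after is the body.
print "Content-Type: text/plain"
print ""
print "You submitted: $data"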
MIRRORTOOL(1)                 OMT documentation                 MIRRORTOOL(1)

NAME
       mirrortool.pl - OpaL Mirror Tool (OMT)

DESCRIPTION
       Creates a mirror of a web page. It has a number of features, such as
       link rewriting and more (see the options below).

USAGE
       mirrortool.pl [options] [url] [options] [url] [...]

OPTIONS
       --images             : Include <img src=xxx>:s in the download. (default)
       --noimages           : Do not include <img src=xxx>:s in the download.
       --depth n            : Maximum recursion depth. (default 1)
       --store "regexp"     : Files matching regexp are actually stored locally.
                            : It is possible to | separate (with or).
       --rewrite "from=>to" : Urls are rewritten using these rules.
                            : It is possible to | separate (with or).
                            : Do not rewrite the dir, because it will affect
                            : later lookup. Have to fix this sometime.
       --what "regexp"      : Files matching regexp are downloaded and traversed.
                            : It is possible to | separate (with or).
       --dir basedir        : Where to store local files.
       --nohostcheck        : Do not check if the url points to another host.
       --notreecheck        : Do not check if the url points to another dirtree.
       --force              : Overwrite all files.
       --debug              : Print debug messages.
       --retry n            : Number of times a url will be retried. (default 1)
       --auth user:pass     : Use Basic Authentication.
       --proxy url          : Use a proxy server (like http://u:p@localhost/).
       --help               : Print this text.

AUTHOR
       Ola Lundqvist <opal@lysator.liu.se>

SEE ALSO
       mirrortool.pl(1)

perl v5.8.8                        2002-04-15                   MIRRORTOOL(1)
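
For example, a hypothetical invocation (the URL and directory are placeholders) that mirrors a site two levels deep, retrying each failed fetch:

Code:
mirrortool.pl --depth 2 --retry 3 --dir /tmp/mirror http://example.com/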