Shell Programming and Scripting: Running web pages located outside webserver space
Post 302469680 by Scott, 11-07-2010
Using aliases?

From httpd.conf:
Code:
    # Alias: Maps web paths into filesystem paths and is used to
    # access content that does not live under the DocumentRoot.
    # Example:
    # Alias /webpath /full/filesystem/path
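
For example, a minimal sketch using hypothetical paths (the <Directory> block grants access to the out-of-tree location, which Apache needs in addition to the Alias itself):

Code:
    # Hypothetical paths: serve /extra from a directory that lives
    # outside the DocumentRoot.
    Alias /extra "/srv/outside-docroot/extra"

    <Directory "/srv/outside-docroot/extra">
        Order allow,deny
        Allow from all
    </Directory>

With that in place, a request for /extra/index.html is served from /srv/outside-docroot/extra/index.html.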


10 More Discussions You Might Find Interesting

1. UNIX for Dummies Questions & Answers

Dynamic web pages for Unix Web Server

Hi, my company is considering a redevelopment of our web site, which used to run on Apache over Solaris. The company that is going to do this for us only knows how to develop it in ASP. I guess this means we'll have to have another IIS server on NT for these dynamic pages :( What are... (5 Replies)
Discussion started by: me2unix

2. Shell Programming and Scripting

Count links in all of my web pages

Count the number of hyperlinks in all web pages in the current directory and all of its sub-directories, counting in all files of type "*htm" and "*html". I want the output to look something like this: Total number of web pages: (number) Total number of links: (number) Average number of links... (1 Reply)
Discussion started by: phillip
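
One way to sketch this in the shell (assuming links appear as plain href= attributes and GNU find/grep are available; this is not the poster's actual solution):

Code:
    #!/bin/bash
    # Count hyperlinks in all *.htm/*.html files under the current
    # directory. Assumes links are href="..." attributes.
    pages=$(find . -type f \( -name '*.htm' -o -name '*.html' \) | wc -l)
    links=$(find . -type f \( -name '*.htm' -o -name '*.html' \) \
              -exec grep -o -i 'href=' {} + | wc -l)
    echo "Total number of web pages: $pages"
    echo "Total number of links: $links"
    # Guard against division by zero when no pages are found
    [ "$pages" -gt 0 ] && echo "Average number of links: $((links / pages))"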

3. UNIX for Dummies Questions & Answers

Selecting information from several web pages...

Hi All! Is this possible? I know of several hundred URLs linking to similar-looking HP-UX man pages, like these. In these URLs only the last words separated by / change in numbering, so we can generate these... http://docs.hp.com/hpux/onlinedocs/B3921-90010/00/00/31-con.html... (2 Replies)
Discussion started by: Vishnu
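
Since only the numeric part of each URL changes, a loop can generate and fetch them. A minimal sketch (the range and file-name pattern are hypothetical; adjust them to the real layout):

Code:
    #!/bin/bash
    # Generate numbered URLs and save each page locally. The numeric
    # range and the "NN-con.html" pattern are placeholders.
    base="http://docs.hp.com/hpux/onlinedocs/B3921-90010/00/00"
    for n in $(seq 1 99); do
        curl -s "${base}/${n}-con.html" -o "page-${n}.html"
    done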

4. Shell Programming and Scripting

Investigating web pages in awk

Hello. I want to make an awk script to search an HTML file and output all the links (e.g. .html, .htm, .jpg, .doc, .pdf, etc.) inside it. Also, I want the links that are output to be split into 3 groups (separated by an empty line), the first group with links to other webpages (.html, .htm, etc.),... (1 Reply)
Discussion started by: adpe
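
A grep-based sketch of the grouping idea (not pure awk, and the grouping rules are illustrative; assumes GNU grep and one link per href/src attribute):

Code:
    #!/bin/bash
    # Extract href/src targets from an HTML file and print them in
    # three groups separated by empty lines: web pages, images, rest.
    file="$1"
    links=$(grep -o -i -E '(href|src)="[^"]+"' "$file" | cut -d'"' -f2)
    echo "$links" | grep -i -E '\.html?$'                 # group 1: web pages
    echo
    echo "$links" | grep -i -E '\.(jpg|jpeg|png|gif)$'    # group 2: images
    echo
    echo "$links" | grep -i -v -E '\.(html?|jpg|jpeg|png|gif)$'   # group 3: the rest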

5. UNIX for Dummies Questions & Answers

curl command with web pages

I can't quite seem to understand what the curl command does with a web address. I tried this: curl http://www.oreilly.com but I just got the first few lines of a web page, and it's nowhere on my machine. Can someone elaborate? (2 Replies)
Discussion started by: Straitsfan
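
For context: curl writes the fetched page to standard output by default, so nothing lands on disk unless you redirect it or pass -o/-O. A short illustration (the file names are arbitrary):

Code:
    # curl prints the page body to stdout by default:
    curl http://www.example.com/

    # redirect stdout to keep a copy on disk:
    curl http://www.example.com/ > page.html

    # or let curl write the file itself:
    curl -o page.html http://www.example.com/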

6. UNIX for Dummies Questions & Answers

Forcing web pages to anti-alias

Here is an observation that has started to puzzle me, and perhaps someone can enlighten me. When a web page (or desktop page, for that matter) uses the standard font, it is not anti-aliased unless the user opts in via the desktop settings. It appears however that fonts are not... (0 Replies)
Discussion started by: figaro

7. Shell Programming and Scripting

Checking Web Pages?

Hey guys, unfortunately I cannot use wget on our systems.... I am looking for another way for a UNIX script to test web pages and let me know if they are up or down for some of our applications. Has anyone seen this before? Thanks, Ryan (2 Replies)
Discussion started by: rwcolb90
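
One hedged approach, assuming curl is available even where wget is not (the URLs are placeholders):

Code:
    #!/bin/bash
    # Report whether each URL answers with HTTP 200.
    for url in http://app1.example.com/ http://app2.example.com/; do
        status=$(curl -s -o /dev/null -w '%{http_code}' "$url")
        if [ "$status" = "200" ]; then
            echo "UP   $url"
        else
            echo "DOWN $url (HTTP $status)"
        fi
    done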

8. Shell Programming and Scripting

Get web pages and compare

Hello, I'm writing a shell script to wget the content of web pages from multiple servers into a variable and compare them; if they match, return 0, otherwise return 2. #!/bin/bash # Cluster 1 CLUSTER1_SERVERS="srv1 srv2 srv3 srv4" CLUSTER1_APPLIS="test/version.html test2.version.jsp" # List of... (4 Replies)
Discussion started by: gtam

9. Shell Programming and Scripting

Get web pages and compare

Hello, I'm writing a script to get the content of web pages on different machines and compare them using their md5 hashes. Here is my code: #!/bin/bash # Cluster 1 CLUSTER1_SERVERS="srv01:7051 srv02:7052 srv03:7053 srv04:7054" CLUSTER1_APPLIS="test/version.html test2/version.html... (2 Replies)
Discussion started by: gtam
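
The md5 idea from these two threads can be sketched like this (hostnames, ports, and the page path are placeholders; assumes curl and md5sum are available):

Code:
    #!/bin/bash
    # Fetch the same page from several servers and flag any server
    # whose md5 sum differs from the first server's.
    SERVERS="srv01:7051 srv02:7052 srv03:7053 srv04:7054"
    PAGE="test/version.html"

    reference=""
    for server in $SERVERS; do
        sum=$(curl -s "http://${server}/${PAGE}" | md5sum | cut -d' ' -f1)
        [ -z "$reference" ] && reference="$sum"
        if [ "$sum" != "$reference" ]; then
            echo "MISMATCH on $server"
            exit 2   # the thread's convention: return 2 on mismatch
        fi
    done
    echo "All servers match"
    exit 0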

10. UNIX for Beginners Questions & Answers

Help with running a script on files located in subdirectories

Hello everyone, I am a newbie to coding, so I am reaching out in hopes that I can get some help from this forum. I am trying to run the script below from a single directory; however, the directory has many subfolders. In each of those subfolders is a file, uniquely named to that specific... (3 Replies)
Discussion started by: azurite
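
A common pattern for this kind of task is find with -exec (the script name and the *.txt glob are hypothetical; substitute the real ones):

Code:
    #!/bin/bash
    # Run a script once per matching file in every subdirectory.
    find . -type f -name '*.txt' -exec ./myscript.sh {} \;

    # Equivalent loop form, when the script takes the path as $1:
    find . -type f -name '*.txt' | while IFS= read -r file; do
        ./myscript.sh "$file"
    done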
WWW::Mechanize::GZip(3pm)        User Contributed Perl Documentation        WWW::Mechanize::GZip(3pm)

NAME
    WWW::Mechanize::GZip - tries to fetch webpages with gzip-compression

VERSION
    Version 0.10

SYNOPSIS
    use WWW::Mechanize::GZip;

    my $mech = WWW::Mechanize::GZip->new();
    my $response = $mech->get( $url );

    print "x-content-length (before unzip) = ", $response->header('x-content-length');
    print "content-length (after unzip) = ", $response->header('content-length');

DESCRIPTION
    The WWW::Mechanize::GZip module tries to fetch a URL by requesting gzip-compression from the
    webserver. If the response contains a header with 'Content-Encoding: gzip', it decompresses
    the response in order to get the original (uncompressed) content. This module helps to reduce
    bandwidth when fetching webpages, if supported by the webserver. If the webserver does not
    support gzip-compression, no decompression is made. This module is a direct subclass of
    WWW::Mechanize and therefore supports any methods provided by WWW::Mechanize. The
    decompression is handled by Compress::Zlib::memGunzip. There is a small webform where you can
    instantly test whether a webserver supports gzip-compression on a particular URL:
    <http://www.computerhandlung.de/www-mechanize-gzip.htm>

METHODS
    prepare_request
        Adds 'Accept-Encoding' => 'gzip' to outgoing HTTP-headers before sending.

    send_request
        Unzips the response-body if 'content-encoding' is 'gzip' and corrects 'content-length' to
        the unzipped content-length.

SEE ALSO
    WWW::Mechanize
    Compress::Zlib

AUTHOR
    Peter Giessner "cardb@planet-elektronik.de"

LICENCE AND COPYRIGHT
    Copyright (c) 2007, Peter Giessner "cardb@planet-elektronik.de". All rights reserved.
    This module is free software; you can redistribute it and/or modify it under the same terms
    as Perl itself.

perl v5.10.0                               2009-06-24                       WWW::Mechanize::GZip(3pm)