07-16-2001
Unfortunately you are going to need to know which web server software is running, because the config file will be software-specific. If it's AIX it may well be WebSphere, if they have chosen to run IBM's own web server software, but it may be another product such as Netscape (iPlanet).
Either way, this config file will contain what is called the document root for a web instance: the directory the server uses as the root when serving pages for that instance.
In Netscape Enterprise Server this is located in the obj.conf file; I don't know about WebSphere. You may want to try grepping for "document-root" (use grep's -i option for a case-insensitive match) and you might be able to track the file down.
Something like:
find / -type f -exec grep -il "document-root" {} \; 2>/dev/null
The -l option makes grep print the name of each matching file rather than the matching lines, and redirecting stderr hides the permission errors you'll hit on directories you can't read.
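Once you've found the file, you can pull the document root out of it. Here's a sketch using a sample obj.conf created under /tmp; the path and the exact NameTrans line are illustrative, as the real file's location and contents vary by install:

```shell
# Create a sample obj.conf to demonstrate (real location varies by install)
mkdir -p /tmp/ns-demo/config
cat > /tmp/ns-demo/config/obj.conf <<'EOF'
<Object name="default">
NameTrans fn="document-root" root="/usr/netscape/docs"
</Object>
EOF

# Find the document-root line and strip it down to just the directory path
grep -i "document-root" /tmp/ns-demo/config/obj.conf |
    sed 's/.*root="\([^"]*\)".*/\1/'
```

On the sample file above this prints /usr/netscape/docs, which is the directory the instance serves pages from.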
Hope this helps.
Regards.