AWK script to detect webpages from file
Post 302406264 by pludi on Monday 22nd of March 2010 12:22:35 PM
Double post, continued here, thread closed.
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Unix script to detect new file entry in directory

Hi all, I want to detect each new file created in a Unix directory. Whenever a new file arrives in the directory, I need to get its details (size, date, and timestamp) and store them in another file. Could anyone please tell me how I can achieve that? Thanks. (13 Replies)
Discussion started by: james_1984
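One common cron-friendly approach to the question above is to keep a snapshot of the directory listing and diff it against the current listing on each run. A minimal sketch (all paths here are illustrative, and the demo simulates a file arriving between two runs):

```shell
#!/bin/sh
# Snapshot-based detection of new files in a directory (cron-friendly).
WATCHDIR=/tmp/watchdemo
STATE=/tmp/watchdemo.known      # previous directory listing
DETAILS=/tmp/watchdemo.details  # size/date/time of each new file

detect_new() {
    touch "$STATE"
    ls -1 "$WATCHDIR" > "$STATE.now"
    # comm -13 prints lines present only in the new listing
    comm -13 "$STATE" "$STATE.now" | while read -r f; do
        ls -l "$WATCHDIR/$f" >> "$DETAILS"
    done
    mv "$STATE.now" "$STATE"
}

# Demo: take an empty baseline, then a file appears between runs
rm -rf "$WATCHDIR" "$STATE" "$DETAILS"
mkdir -p "$WATCHDIR"
detect_new                      # baseline snapshot
touch "$WATCHDIR/report.txt"    # a "new" file arrives
detect_new                      # second run records its details
cat "$DETAILS"
```

On Linux, `inotifywait` (from inotify-tools) can do the same thing event-driven instead of by polling.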

2. UNIX for Dummies Questions & Answers

how to detect my script is already running

I have a script which must not be run more than once at any given time. This script will be scheduled to run every 20 minutes as a cron job. Can I have logic in my script that says: if this script is already running from the previous cron run, then exit? How do I go about doing that? If you describe the... (11 Replies)
Discussion started by: rmulchandani
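A classic answer to the single-instance question is an `mkdir`-based lock: directory creation is atomic, so only one process can take the lock. A minimal sketch (the lock path is illustrative):

```shell
#!/bin/sh
# Single-instance guard using mkdir, which is atomic: if a second copy
# is started by cron while the first still runs, its mkdir fails.
LOCKDIR=/tmp/myscript.lock

if mkdir "$LOCKDIR" 2>/dev/null; then
    # Remove the lock on exit so a finished run doesn't block future ones
    trap 'rmdir "$LOCKDIR"' EXIT
    echo "lock acquired, doing work"
    # ... real work goes here ...
else
    echo "already running, exiting"
    exit 0
fi
```

On Linux, `flock(1)` offers the same guarantee with automatic release when the process dies.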

3. Shell Programming and Scripting

AWK script to detect webpages from file

Hi guys, I'm very new to Unix and I have to create an awk script that detects webpage addresses in a file/webpage and outputs how many times each address was detected. E.g., if my file was:
www.google.com www.facebook.com www.google.com
the output should be:
www.google.com x2... (2 Replies)
Discussion started by: ROFL
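The usual awk idiom for this kind of counting is an associative array indexed by the address, printed in an END block (piped through `sort` here only to make the output order stable):

```shell
#!/bin/sh
# Count how many times each address appears, awk-array style.
printf '%s\n' www.google.com www.facebook.com www.google.com |
awk '{ count[$1]++ }
     END { for (u in count) print u " x" count[u] }' |
sort
# prints:
#   www.facebook.com x1
#   www.google.com x2
```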

4. Shell Programming and Scripting

script to detect a file from inserted usb and puts into a Variable

There is a log file with the same name on my two different Android phones. When I plug a phone into my computer, it appears in the media folder; for example, on the first Android phone: /media/F6BA-0AF5/folder/A.log. I want to put that path into a variable to be manipulated.... (3 Replies)
Discussion started by: tobenguyen
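Since the mount label under /media differs per phone, one way is to glob over it and capture the match with `set --`. A minimal sketch (a scratch directory stands in for /media so it runs anywhere):

```shell
#!/bin/sh
# Capture a same-named log file from whichever mount appears under the
# media directory. MEDIA is a stand-in for /media for demo purposes.
MEDIA=/tmp/media-demo
rm -rf "$MEDIA"
mkdir -p "$MEDIA/F6BA-0AF5/folder"
echo "phone one log" > "$MEDIA/F6BA-0AF5/folder/A.log"

# set -- expands the glob into the positional parameters;
# $1 is then the first (usually only) match.
set -- "$MEDIA"/*/folder/A.log
if [ -f "$1" ]; then
    LOGFILE=$1
    echo "found: $LOGFILE"
else
    echo "no log file mounted" >&2
fi
```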

5. Shell Programming and Scripting

Script to detect dynamic ip change and update to config file

Hi all, I am a newbie here and request your assistance. I have a service running on a public IP, but since the IP is dynamic it keeps changing, and every time I need to manually get the new IP, add it to the config file, and restart the service. This has become a bit time consuming. Hence, I... (4 Replies)
Discussion started by: Shaan_Shaan
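A cron job for this usually caches the last-seen IP and only touches the config when it changes. A minimal sketch: in a real job the current IP would come from an external service (e.g. `curl -s ifconfig.me`); here it is stubbed so the sketch runs offline, and all paths, the config key, and the addresses are illustrative. `sed -i` is GNU sed:

```shell
#!/bin/sh
# Update a config file and restart a service only when the public IP
# changes, comparing against a cached copy of the last-seen IP.
CONF=/tmp/service-demo.conf
CACHE=/tmp/service-demo.lastip
CURRENT_IP=203.0.113.7              # stub; really: $(curl -s ifconfig.me)

echo "listen_address=198.51.100.1" > "$CONF"   # pretend old config
echo "198.51.100.1" > "$CACHE"                 # pretend cached old IP

LAST_IP=$(cat "$CACHE" 2>/dev/null)
if [ "$CURRENT_IP" != "$LAST_IP" ]; then
    # rewrite the address line, remember the new IP, restart the service
    sed -i "s/^listen_address=.*/listen_address=$CURRENT_IP/" "$CONF"
    echo "$CURRENT_IP" > "$CACHE"
    echo "IP changed to $CURRENT_IP - service restart goes here"
fi
```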

6. Shell Programming and Scripting

How to detect empty field in awk ?

Hi, programmers! I need to detect empty fields in a file. My file looks like this:
40.900|-71.600|1.6|20|1|1961|21.00|3.700||1|US|28035|10029370|31
40.900|-71.600|5.7|20|1|1961|21.00|3.700||1|US|28035|10029370|31
40.900|-71.600|7.8|20|1|1961|21.00|3.700||1|US|28035|10029370|31... (7 Replies)
Discussion started by: Dona Clara
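With `|` as the field separator, awk can simply loop over all NF fields and report the empty ones. A minimal sketch using the first sample line above:

```shell
#!/bin/sh
# Report which pipe-delimited fields are empty on each line.
printf '%s\n' \
  '40.900|-71.600|1.6|20|1|1961|21.00|3.700||1|US|28035|10029370|31' |
awk -F'|' '{
    for (i = 1; i <= NF; i++)
        if ($i == "") print "line " NR ": field " i " is empty"
}'
# prints: line 1: field 9 is empty
```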

7. Shell Programming and Scripting

Help with detect with regex and move script

Hi all, I need some help with a script that will search for video files by known extensions, then do a pattern search (I'm guessing via regex), and then, based on a match of one type or another, move each file to an assigned directory. I would like to do this with either a shell script... (7 Replies)
Discussion started by: Simplify
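One way to sketch this: glob for the known extensions, then classify each name with `grep -E`. Here an SxxExx tag (e.g. S01E02) is taken to mark TV episodes; the directories, extensions, and pattern are all illustrative assumptions:

```shell
#!/bin/sh
# Sort video files into tv/ or movies/ by a regex on the filename.
SRC=/tmp/videos-demo
rm -rf "$SRC"
mkdir -p "$SRC/tv" "$SRC/movies"
touch "$SRC/Show.S01E02.mkv" "$SRC/SomeFilm.2010.avi"   # demo files

for f in "$SRC"/*.mkv "$SRC"/*.avi; do
    [ -f "$f" ] || continue              # skip unmatched glob patterns
    if basename "$f" | grep -Eq '[Ss][0-9]{2}[Ee][0-9]{2}'; then
        mv "$f" "$SRC/tv/"
    else
        mv "$f" "$SRC/movies/"
    fi
done
ls "$SRC/tv" "$SRC/movies"
```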

8. Shell Programming and Scripting

How to detect awk and nawk?

I have 10 scripts which use awk/nawk extensively. I always have to change awk to nawk and nawk to awk when I deploy my scripts to different types of servers, as some support nawk while others support awk. Can you propose a solution that works without having to tweak my 10 scripts at several... (8 Replies)
Discussion started by: mohtashims
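The usual fix is to detect once with `command -v` and call the interpreter through a variable, so the same script runs unchanged on Solaris-style servers (nawk) and Linux (awk). A minimal sketch:

```shell
#!/bin/sh
# Pick whichever of nawk/awk exists and use it through one variable.
if command -v nawk >/dev/null 2>&1; then
    AWK=nawk
else
    AWK=awk
fi
echo "using: $AWK"
echo "1 2 3" | "$AWK" '{ print $2 }'   # prints 2
```

Sourcing this detection from a shared file keeps all 10 scripts untouched except for replacing literal `awk`/`nawk` with `"$AWK"`.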

9. Shell Programming and Scripting

awk script to detect specific string in a log file and count it

Hello, can someone guide me on this? I don't know what the best approach is (awk script, shell script). I am using Red Hat Linux version 6.5. There is a third-party application deployed on that server. By default this app generates 5 log files, each 20MB. These logs roll over... (5 Replies)
Discussion started by: ktisbest
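Counting a specific string across several rotated log files is a one-liner in awk: match the pattern, bump a counter, print it in END. A minimal sketch (the "ERROR" string, log names, and contents are illustrative):

```shell
#!/bin/sh
# Count occurrences of a string across rotated logs in one awk pass.
LOGDIR=/tmp/logs-demo
rm -rf "$LOGDIR"; mkdir -p "$LOGDIR"
printf 'ok\nERROR disk full\nok\n'    > "$LOGDIR/app.log.1"
printf 'ERROR timeout\nERROR retry\n' > "$LOGDIR/app.log.2"

# n+0 forces numeric output (0) even when nothing matches
awk '/ERROR/ { n++ } END { print n + 0 }' "$LOGDIR"/app.log.*
# prints: 3
```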

10. Shell Programming and Scripting

How to detect url in use in a script?

Hello, I have a small script that runs from a web application in the below format: pipe:///path_to_myscript.sh url1 url2 url3
myscript.sh:
#!/bin/bash
count=0
while
do
count=$((count+1))
exec 3>&1
((ffmpeg -i $1 ...... -f mpegts pipe:1 2>/dev/null 1>&3 ) 2>&1 | \
while read LINE; do echo... (9 Replies)
Discussion started by: baris35
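Since the web app passes every URL as a separate argument, the script can iterate over `"$@"` instead of reading only `$1`. A minimal sketch of that argument handling (the ffmpeg call is left as a comment; `set --` simulates the web app's invocation here):

```shell
#!/bin/bash
# Process each URL argument in turn; "$@" preserves the individual
# arguments exactly as the caller passed them.
set -- url1 url2 url3          # simulates: myscript.sh url1 url2 url3
count=0
for url in "$@"; do
    count=$((count + 1))
    echo "stream $count: $url"  # ffmpeg invocation for $url goes here
done
echo "total: $count"
```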
WAPITI(1)							   User Commands							 WAPITI(1)

NAME
wapiti - a web application vulnerability scanner

SYNOPSIS
       wapiti http://server.com/base/url/ [options]

DESCRIPTION
       Wapiti allows you to audit the security of your web applications. It performs "black-box" scans, i.e. it does not study the source code of the application, but scans the pages of the deployed webapp, looking for scripts and forms where it can inject data. Once it has this list, Wapiti acts like a fuzzer, injecting payloads to see if a script is vulnerable.

OPTIONS
       -s, --start <url>
              specify a URL to start with
       -x, --exclude <url>
              exclude a URL from the scan (for example, logout scripts); you can also use a wildcard (*):
              Example: -x "http://server/base/?page=*&module=test"
              or -x "http://server/base/admin/*" to exclude a directory
       -p, --proxy <url_proxy>
              specify a proxy (-p http://proxy:port/)
       -c, --cookie <cookie_file>
              use a cookie
       -t, --timeout <timeout>
              set the timeout (in seconds)
       -a, --auth <login%password>
              set credentials (for HTTP authentication); doesn't work with Python 2.4
       -r, --remove <parameter_name>
              remove a parameter from URLs
       -m, --module <module>
              use a predefined set of scan/attack options:
              GET_ALL: only use GET requests (no POST)
              GET_XSS: only XSS attacks with the HTTP GET method
              POST_XSS: only XSS attacks with the HTTP POST method
       -u, --underline
              use color to highlight vulnerable parameters in output
       -v, --verbose <level>
              set the verbosity level: 0: quiet (default), 1: print each URL, 2: print every attack
       -h, --help
              print the help page

EFFICIENCY
       Wapiti is developed in Python and uses a library called lswww. This web spider library does most of the work. Unfortunately, the HTML parser module within Python only works with well-formed HTML pages, so lswww fails to extract information from badly coded webpages. Tidy can clean these webpages on the fly, so lswww will give pretty good results. To make Wapiti far more efficient, you should: apt-get install python-utidylib python-ctypes

AUTHOR
       Copyright (C) 2006-2007 Nicolas Surribas <nicolas.surribas@gmail.com>
       Manpage created by Thomas Blasing <thomasbl@pool.math.tu-berlin.de>
       http://wapiti.sourceforge.net/

July 2007                                                                                                WAPITI(1)
Unix & Linux Forums Content Copyright 1993-2022. All Rights Reserved.