cat check_URL.ksh
URL="http://chu.com/"
STATUS_CHECK=$(wget --spider -S "$URL" 2>&1 | grep "200 OK" | awk '{print $2}')
if [[ -n $STATUS_CHECK ]]
then
    echo "$URL is up and fine."
else
    echo "$URL is DOWN."
fi
Hi all,
(sorry in advance for my bad English)
I have a problem with a web application that seems to "freeze", and I want to write a small Unix script to check the application.
Does anyone know a command to test a URL? The application is on a server where I cannot install... (2 Replies)
Hello,
I need to redirect an existing URL; how can I do that?
There is a current web address to a GUI that I have to redirect to another web address. Does anyone know how to do this?
This is on Unix/Linux boxes.
example:
https://m45.testing.address.net/host.php
make it so the... (3 Replies)
This is the code:
while test 1 -eq 1
do
    read a
    $a
    if test $a = stop
    then
        break
    fi
done
I read a command on every loop and execute it.
I check whether the string equals the word stop to end the loop, but it says that I gave too many arguments to test.
For example, echo hello.
Now the... (1 Reply)
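The usual cause of that error is word splitting: with `a="echo hello"`, the unquoted `test $a = stop` expands to `test echo hello = stop`, four operands, which test(1) rejects. Quoting the variable keeps it as a single operand. A minimal sketch of the fix, using the thread's `stop` keyword:

```shell
# Unquoted, `test $a = stop` with a="echo hello" expands to
#   test echo hello = stop
# and fails with "too many arguments". Quoting $a keeps the whole
# string as one operand for the comparison:
a="echo hello"
if test "$a" = stop
then
    echo "stopping"
else
    echo "not stop, running: $a"
fi
```

The same quoting applies inside the loop: `if test "$a" = stop` works no matter how many words the user typed.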
I am trying to find a way to test some code, but I need to rewrite a specific URL only from a specific HTTP_HOST
The call goes out to
http://SUB.DOMAIN.COM/showAssignment/7bde10b45efdd7a97629ef2fe01f7303/jsmodule/Nevow.Athena
The ID in the middle is always random due to the cookie.
I... (5 Replies)
Here is what I have so far:
find . -name "*php*" -or -name "*htm*" | xargs grep -i iframe | awk -F'"' '/<iframe*/{gsub(/.\*iframe>/,"\"");print $2}'
Here is an example content of a PHP or HTM(HTML) file:
<iframe src="http://ADDRESS_1/?click=5BBB08\" width=1 height=1... (18 Replies)
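One way to simplify the grep/awk chain above is a single sed substitution that captures just the src URL from each iframe tag. A sketch under the thread's assumptions; the sample file below only mimics the HTML shown, and the address is the thread's placeholder:

```shell
# Build a sample file resembling the thread's PHP/HTM content.
cat > sample.htm <<'EOF'
<iframe src="http://ADDRESS_1/?click=5BBB08" width=1 height=1></iframe>
EOF

# -n suppresses normal output; the trailing p prints only lines where
# the substitution matched, leaving just the captured src URL.
sed -n 's/.*<iframe[^>]*src="\([^"]*\)".*/\1/p' sample.htm
```

This can be combined with the original find, e.g. `find . -name "*php*" -or -name "*htm*" | xargs sed -n '...'`, though it still assumes one iframe per line.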
Hi,
I have a problem where I have to hit multiple URLs stored in a text file (input.txt) and save their output in different text files (output.txt), somewhat like:
cat input.txt
http://192.168.21.20:8080/PPUPS/international?NUmber=917875446856... (3 Replies)
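A simple read loop handles this: fetch each line of input.txt and write each response to its own numbered file. A sketch; the sample URL is the unreachable example address from the thread, and the output_N.txt naming is an assumption:

```shell
# Sample input file, one URL per line (address is the thread's example).
cat > input.txt <<'EOF'
http://192.168.21.20:8080/PPUPS/international?NUmber=917875446856
EOF

# fetch URL ($1) and save the response body to file ($2)
fetch() {
    wget -q --timeout=10 --tries=1 -O "$2" "$1"
}

n=0
while IFS= read -r url
do
    [ -n "$url" ] || continue      # skip blank lines
    n=$((n + 1))
    fetch "$url" "output_${n}.txt" || echo "failed: $url"
done < input.txt
```

Quoting `"$url"` matters here too, since query strings can contain characters the shell would otherwise mangle.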
Hello,
I am very new to Perl, please help me here!
I need help reading a URL from the command line using WWW::Mechanize, and I need all the contents from the URL to go into a file.
Below is the script I have written so far,
#!/usr/bin/perl
use LWP::UserAgent;
use... (2 Replies)
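For comparison, the same task in shell is a one-line fetch: read the URL from the command line and dump the page contents into a file. This is only a sketch of the equivalent behaviour, not the thread's Perl answer; in WWW::Mechanize the analogous calls are `get($url)` and writing `$mech->content` to a file:

```shell
# Sketch: fetch the page at URL ($1) and save its contents to file ($2).
fetch_to_file() {
    wget -q --timeout=10 -O "$2" "$1"
}

# usage: fetch_to_file "http://example.com/" page.html
```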
Discussion started by: scott_cog
LEARN ABOUT SUSE
wmgrabimage
WMGRABIMGAE(1) General Commands Manual WMGRABIMGAE(1)

NAME
WMGRABIMGAE - Dockable WWW Image monitor.
SYNOPSIS
wmGrabImage [-h] [-display <Display>] -url <Image URL> [-http <URL>] [-c] [-delay <Time>]
DESCRIPTION
wmGrabImage is a WindowMaker DockApp that maintains a small thumbnail copy of your favorite image from the WWW. The image to monitor is
specified via the "-url <Image URL>" command-line option and it gets updated approximately every 5 minutes. The update interval can be
overridden via the "-delay <Time>" command-line option (Time is in seconds).
Each of the three mouse buttons can be double-clicked, with the following effects:
Left Mouse:
Brings up the full-sized image in xv.
Middle Mouse:
Sends a URL (specified via the -http <URL> command-line option) to an already running netscape process, or to a new netscape process
if there aren't any running.
Right Mouse:
Updates the image immediately.
OPTIONS
-h Display list of command-line options.
-display [display]
Use an alternate X Display.
-url <Image URL>
The URL of the WWW image to monitor.
-http <URL>
The URL to send to netscape via a Middle double click.
-c Center the image vertically within the icon.
-delay <Time>
The time between updates. The default is about 5 minutes.
FILES
The original sized image and the thumbnail XPM image are both stored in ~/.wmGrabImage/ which gets created if it doesn't already exist.
SEE ALSO
wget and the ImageMagick convert utility.
BUGS
Who knows? -- it's still Beta, though. (Let me know if you find any.) Oldish versions of the ImageMagick convert utility have a memory leak
-- if you have that problem, upgrade to the latest version.
AUTHOR
Michael G. Henderson <mghenderson@lanl.gov>
16 December 1998 WMGRABIMGAE(1)