Hello,
What I am trying to do is to fetch the HTML data of a website automatically.
First I decided to do it manually, and in a terminal I entered the command below:
Unfortunately the code.html file was empty.
When I entered the command below, it gave Error 303-304.
When I try the command below, I see what I want to achieve, but it only streams the data to the terminal window, where it can't be copied:
Could you please let me know whether there is a way to fetch the data with curl and save it into a file?
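A minimal sketch of saving a page to a file with curl (the URL and filename below are placeholders, not from the thread). An empty output file after a 30x response usually means curl did not follow a redirect; -L fixes that, and -o writes the body to the named file.

```shell
# Hypothetical URL and output name; -L follows 3xx redirects (301/302/303),
# a common cause of an empty output file, and -o writes the body to a file.
url="https://example.com/"
outfile="code.html"

# The real fetch would be:
#   curl -sSL -o "$outfile" "$url"
# wget's equivalent:
#   wget -O "$outfile" "$url"

# Build the command here instead of hitting the network, so the flags are visible:
fetch_cmd="curl -sSL -o $outfile $url"
echo "$fetch_cmd"
```

-s silences the progress meter and -S keeps real error messages visible, which helps when the transfer fails silently.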
We are trying to invoke an HTTPS service from our Unix script using the curl command. The service is not getting invoked because it is SSL-configured, and bypassing certificate verification (using curl -k) does not work.
curl -k https://site
curl -k -x IP:Port https://site
curl -k -x IP:443 https://id:pwd@site
... (0 Replies)
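Rather than bypassing verification with -k, curl can be pointed at the server's CA certificate. A sketch, assuming the CA bundle has been exported to a local file; the host, proxy, and file path below are placeholders:

```shell
# Hypothetical host and CA file; --cacert verifies the server against a
# specific CA bundle instead of disabling verification with -k.
site="https://site.example.com/service"
cabundle="/etc/ssl/ca.pem"

# The real calls would be:
#   curl --cacert "$cabundle" "$site"
#   curl --cacert "$cabundle" -x proxy.example.com:8080 "$site"   # via a proxy
# Adding -v prints the TLS handshake, which shows *why* the connection fails.

cmd="curl --cacert $cabundle $site"
echo "$cmd"
```

If -k already fails, the problem is often not the certificate at all but the TLS handshake or proxy; -v output is the quickest way to tell the cases apart.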
Hello,
I am wondering whether anyone knows of a method, using curl/wget or another tool, whereby I could specify the IP address of the server I wish to query for a website.
Something similar to editing /etc/hosts but that can be done directly from the command line. I have looked through the man pages... (4 Replies)
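curl can pin a hostname to a specific IP without touching /etc/hosts, via its --resolve option. A sketch with a placeholder host and address:

```shell
# --resolve HOST:PORT:ADDRESS makes curl use the given IP for that host/port,
# while still sending the correct Host header and TLS SNI (values are examples).
host="www.example.com"
ip="192.0.2.10"

# The real call would be:
#   curl --resolve "$host:443:$ip" "https://$host/"
# wget has no direct equivalent, but a Host header over plain HTTP comes close:
#   wget --header="Host: $host" "http://$ip/"

cmd="curl --resolve $host:443:$ip https://$host/"
echo "$cmd"
```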
Hi
I need a shell script that will download a zip file every second from an HTTP server, but I can use neither curl nor wget.
Can anyone help me with this task?
Thanks!! (1 Reply)
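Without curl or wget, bash's built-in /dev/tcp pseudo-device can speak HTTP by hand. A rough sketch, assuming a plain-HTTP server on port 80; the hostname and path are placeholders, and this is bash-specific:

```shell
# Hypothetical host/path; bash opens a TCP socket when you redirect to
# /dev/tcp/HOST/PORT. Note the response includes the HTTP headers, which must
# be stripped before the saved file is a valid zip.
host="server.example.com"
path="/file.zip"

fetch_once() {
  exec 3<>"/dev/tcp/$host/80"                                  # open socket on fd 3
  printf 'GET %s HTTP/1.0\r\nHost: %s\r\n\r\n' "$path" "$host" >&3
  cat <&3 > raw_response                                       # headers + body
  exec 3>&-                                                    # close the socket
}

# Once per second:
#   while true; do fetch_once; sleep 1; done

# Build just the request line here, without touching the network:
request=$(printf 'GET %s HTTP/1.0\r\n' "$path")
echo "$request"
```

Polling every second is aggressive; if the server supports it, checking a Last-Modified header first would avoid re-downloading an unchanged file.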
I use curl and wget quite often.
I set up alarms on their output. For instance, I would run wget on a URL and then search for certain strings within the output given by wget.
The problem is, I can't get the entire output or response of my wget/curl command to show up correctly in... (3 Replies)
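wget and curl write their progress and diagnostics to stderr, so a plain pipe or command substitution only sees the body; merging the streams with 2>&1 before searching captures everything. A sketch using a local stand-in function in place of a live URL:

```shell
# Real usage (URL is a placeholder):
#   out=$(wget -O - "https://example.com/" 2>&1)   # body + diagnostics merged
#   printf '%s\n' "$out" | grep "200 OK"
# curl keeps the body on stdout; -sS silences the meter but keeps errors:
#   out=$(curl -sS "https://example.com/" 2>&1)

# Local stand-in that writes to both streams, the way wget does:
fake_fetch() {
  echo "<html>body line</html>"
  echo "HTTP request sent, awaiting response... 200 OK" >&2
}

merged=$(fake_fetch 2>&1)   # without 2>&1 the status line would be lost
echo "$merged"
```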
Hi,
My script needs to crawl data from a third-party site. Currently it is written with wget. The third-party site has a shared interface with different IP addresses.
My wget works with all the IP addresses but one, whereas curl is able to hit that IP address and comes out... (2 Replies)
Experts,
I log in to a third-party site and pull some valuable information with my credentials, which I pass via --post-data in wget.
Now my account is locked. I want my wget run to alert me that the account is locked. How can I achieve this?
My idea is, get the Source page html from the... (2 Replies)
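One approach, as the post suggests, is to save the returned page and grep it for the lock message. A sketch using a canned response in place of the real wget call; the URL, credentials, and exact lock wording are assumptions to be replaced with what the site actually returns:

```shell
# Real usage would first fetch the page (URL/credentials are placeholders):
#   wget --post-data="user=me&pass=secret" -O page.html "https://example.com/login"
# Here a canned response stands in for page.html:
page='<html><body>Your account is locked. Contact support.</body></html>'

# grep -q is quiet (exit status only); -i ignores case. The pattern must
# match the site's real wording.
if printf '%s\n' "$page" | grep -qi 'account is locked'; then
  status="LOCKED"
else
  status="OK"
fi
echo "$status"
```

The if/else can then trigger whatever alert mechanism the script already uses (mail, log entry, non-zero exit).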
Hello,
when I download a webpage from the command line (CLI) with curl or wget, is the target website loaded the same way as when I load it from a browser, meaning the target server connects to the database and renders data from MySQL? Or is only static content downloaded? (2 Replies)
The HTML of the form is as below:
<form name="uploadform" id="uploadform" action="htmlupload.php" enctype="multipart/form-data" method="post"> <table class="tborder" cellpadding="6" cellspacing="1" border="0" width="100%" align="center"> <tr> <td class="tcat"> Upload Files ... (0 Replies)
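For a multipart/form-data form like this, curl's -F option builds the same kind of request a browser would submit. The file field name and host below are assumptions read off the snippet; the action attribute does point at htmlupload.php:

```shell
# -F produces a multipart/form-data POST, matching the form's enctype.
# "uploadfile" and the host are hypothetical; use the name="" attribute of
# the form's actual <input type="file"> element.
cmd='curl -F "uploadfile=@/tmp/report.txt" https://example.com/htmlupload.php'

# The real call is simply the quoted line above:
#   curl -F "uploadfile=@/tmp/report.txt" https://example.com/htmlupload.php
echo "$cmd"
```

The @ prefix tells curl to read the field's content from the named file rather than sending the literal string.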
What can I use instead of wget/curl when I need to log into websites that use JavaScript?
wget and curl don't handle JavaScript. (6 Replies)
Discussion started by: locoroco
LEARN ABOUT DEBIAN
hobbit-statusreport.cgi
HOBBIT-STATUSREPORT.CGI(1)	General Commands Manual	HOBBIT-STATUSREPORT.CGI(1)

NAME
hobbit-statusreport.cgi - CGI program to report a status for a group of servers
SYNOPSIS
hobbit-statusreport.cgi --column=COLUMNNAME [options]
DESCRIPTION
hobbit-statusreport.cgi is a CGI tool to generate a simple HTML report showing the current status of a single column for a group of Xymon
hosts.
E.g., you can use this report to get an overview of all of the SSL certificates that are about to expire.
The generated webpage is a simple HTML table, suitable for copying into other documents or e-mail.
hobbit-statusreport.cgi runs as a CGI program, invoked by your webserver. It is normally run via a wrapper shell-script in the CGI directory for Xymon.
EXAMPLES
The Xymon installation includes two web report scripts using this CGI tool: The hobbit-certreport.sh script generates a list of SSL server
certificates that are yellow or red (i.e. they will expire soon); and the hobbit-nongreen.sh script generates a report of all statuses that
are currently non-green. These can be accessed from a web browser through a URL referencing the script in the Xymon CGI directory (e.g.
"/xymon-cgi/xymon-nongreen.sh").
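A wrapper script of the kind described above might look like the following sketch; the install path and the chosen options are illustrative, modeled on the description of hobbit-certreport.sh:

```shell
#!/bin/sh
# Hypothetical CGI wrapper: report the "sslcert" column for all hosts,
# with status colors and links back to each host's info page.
XYMONHOME=/usr/lib/xymon/server        # illustrative install path
exec "$XYMONHOME/bin/hobbit-statusreport.cgi" \
     --column=sslcert --all --show-colors --link
```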
OPTIONS
--column=COLUMNNAME
Report the status of the COLUMNNAME column.
--all Report the status for all hosts known to Xymon. By default, this tool reports only on the hosts found on the current page from where
the CGI was invoked (by looking at the "pagepath" cookie).
--filter=CRITERIA
Only report on statuses that match the CRITERIA setting. See the bb(1) man-page - in the "hobbitdboard" command description - for
details about specifying filters.
--heading=HTML
Defines the webpage heading - i.e. the "title" tag in the generated HTML code.
--show-column
Include the column name in the display.
--show-colors
Show the status color on the generated webpage. The default is to not show the status color.
--no-colors
Do not include text showing the current color of each status in the report. This is the default.
--show-summary
Show only a summary of the important lines in the status message. By default, the entire status message appears in the generated
HTML code. This option causes the first non-blank line of the status message to be shown, and also any lines beginning with "&COLOR"
which is used by many status messages to point out lines of interest (non-green lines only, though).
--show-message
Show the entire message on the webpage. This is the default.
--link Include HTML links to the host "info" page, and the status page.
--embedded
Only generate the HTML table, not a full webpage. This can be used to embed the status report into an external webpage.
--env=FILENAME
Load the environment from FILENAME before executing the CGI.
--area=NAME
Load environment variables for a specific area. NB: if used, this option must appear before any --env=FILENAME option.
SEE ALSO
xymon(7)

Xymon Version 4.2.3: 4 Feb 2009	HOBBIT-STATUSREPORT.CGI(1)