Encapsulating output of CURL and/or WGET


 
# 1  
Old 09-27-2012

I use curl and wget quite often.

I set up alarms on their output. For instance, I run wget against a URL and then search for certain strings within the output it produces.

The problem is, I can't get the entire output or response of my wget/curl command to show up correctly in the Nagios GUI.

This is because a variety of characters confuse the GUI, including but not limited to /\[]^&. I believe the term for them is "metacharacters".

How can I force it so that ALL output produced by wget or curl is actually shown, down to the very last character?

For example, this is the most basic illustration of what I want:

Code:
OUTPUT=$(wget www.cnn.com)

# quote the expansion so line breaks and spacing survive
echo "${OUTPUT}"

I would like to show everything wget outputs, in the exact format it was output.

This is harder than it sounds.
# 2  
Old 09-27-2012
How about just letting it print instead of continually cramming everything into backticks? You don't always have to do that.

Also, how about checking their exit codes instead of grepping their output? Every process you create returns an exit code for success or failure, and wget is no exception. If it fails to download a page, it will tell you so directly. There's no need to grep for 'error' in the output.
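For instance, a minimal sketch of the exit-code approach (the URL here is just a placeholder):

Code:
# fetch quietly and look only at the exit code -- no grepping of output
wget -q -O /dev/null "http://www.example.com/"
RC=$?

if [ "$RC" -ne 0 ]; then
    echo "CRITICAL: wget failed with exit code $RC"
else
    echo "OK: page downloaded"
fi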

If you're downloading multiple pages and wish to see from a script which ones succeeded, wget has the -nv option, which prints a simple line-by-line success/failure report for each file.
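Something like this, with hypothetical URLs (note that -nv writes its report to stderr, so merge the streams if you want to capture or pipe it):

Code:
# one status line per file; 2>&1 makes the report capturable
wget -nv -O /dev/null http://www.example.com/a.html http://www.example.com/b.html 2>&1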

The real problem is that they're not one thing, they're two streams: stdout is used for data, stderr is used for errors and status messages.

If you want them both to go to stdout: wget ... 2>&1
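So a version of your earlier snippet that actually captures everything might look like this (a sketch: -O - sends the page itself to stdout, 2>&1 folds in the status messages, and the double quotes preserve the line breaks):

Code:
# page body and wget's own messages both land in the variable
OUTPUT=$(wget -O - www.cnn.com 2>&1)

echo "${OUTPUT}"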
# 3  
Old 09-27-2012
Quote:
Originally Posted by Corona688
How about just letting it print instead of continually cramming everything into backticks? You don't always have to do that.

Also, how about checking their exit codes instead of grepping their output? Every process you create returns an exit code for success or failure, and wget is no exception. If it fails to download a page, it will tell you so directly. There's no need to grep for 'error' in the output.

If you're downloading multiple pages and wish to see from a script which ones succeeded, wget has the -nv option, which prints a simple line-by-line success/failure report for each file.

The real problem is that they're not one thing, they're two streams: stdout is used for data, stderr is used for errors and status messages.

If you want them both to go to stdout: wget ... 2>&1
Thank you for the response.

The task I often deal with is this: a client requests that a specific URL be fetched with curl or wget, and they want to make sure certain strings are contained in the returned output.

So in the previous wget example (it's just an example), a client may want to verify that the words "HTTP request sent, awaiting response" appear in the output. In such a case, checking the exit code alone is not good enough. It helps if the alert also contains the full error message.

Let's say the exit code is non-zero. It would help tremendously to actually see what the failure was.

For instance, after running wget on a URL, the following was returned:

Code:
<detail><vXMLVersion>0.0.1</vXMLVersion><addressAndMailPieceInformation><vError>Fatal exception during validationjava.lang.NullPointerException</vError></addressAndMailPieceInformation></detail>


Now, this isn't the typical response. So when the URL check ran and got back this error response, it would have helped to show the entire message in the GUI. It's very difficult to put this response into a variable and output it exactly as it arrived.

See what I mean?
# 4  
Old 09-27-2012
Well, you could save the stderr output in a temp file with 2>filename and spit it out only if the data you get doesn't meet certain requirements.
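A rough sketch of that approach, wired up the way a Nagios plugin usually is (the URL and required string are placeholders, and 0/2 are the conventional Nagios OK/CRITICAL exit codes):

Code:
#!/bin/sh
# placeholders -- substitute the real URL and the string the client requires
URL="http://www.example.com/validate"
REQUIRED="vXMLVersion"

ERRFILE=$(mktemp)
# page body goes into the variable; wget's own messages go to the temp file
BODY=$(wget -O - "$URL" 2>"$ERRFILE")

if printf '%s' "$BODY" | grep -qF "$REQUIRED"; then
    echo "OK: found \"$REQUIRED\" in response"
    STATUS=0
else
    echo "CRITICAL: \"$REQUIRED\" not found; full response follows"
    printf '%s\n' "$BODY"    # quoted, so the original formatting survives
    cat "$ERRFILE"           # include wget's stderr in the alert
    STATUS=2
fi

rm -f "$ERRFILE"
exit $STATUS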