Specifying IP address with curl/wget
Post 302577055 by balajesuri on 11-28-2011, 05:57 AM
Code:
# -o sends wget's own log to /dev/null; -O names the file the page is saved to
wget -o /dev/null -O output.html "www.unix.com"

output.html will contain the HTML of www.unix.com. You can also specify an IP address instead of the URL. Is this what you're looking for?
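For example, a minimal sketch (192.0.2.10 is a placeholder documentation address, not unix.com's real IP): when the server uses name-based virtual hosting you also have to supply the hostname, or the server won't know which site to serve.

Code:
# fetch by IP; --header supplies the virtual-host name the server expects
wget -o /dev/null -O output.html --header="Host: www.unix.com" "http://192.0.2.10/"

# curl equivalent; --resolve pins the hostname to an IP without touching DNS
curl -s -o output.html --resolve www.unix.com:80:192.0.2.10 "http://www.unix.com/"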
 

10 More Discussions You Might Find Interesting

1. Shell Programming and Scripting

Help needed in Curl & Wget

We are trying to invoke an https service from our unix script using the curl command. The service is not getting invoked because it is SSL-configured, and bypassing certificate verification (using curl -k) does not work: curl -k https://site, curl -k -x IP:Port https://site, curl -k -x IP:443 https://id:pwd@site ... (0 Replies)
Discussion started by: dineshbabu01
0 Replies
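Editor's note: a hedged debugging sketch for this kind of failure (the URL and CA-bundle path are placeholders). Verbose output usually shows where the TLS handshake breaks, and trusting the right CA bundle is safer than disabling checks with -k.

Code:
# -v prints the full TLS negotiation so you can see where it fails
curl -v https://site.example.com/service

# verify against an explicit CA bundle instead of bypassing verification with -k
curl --cacert /path/to/ca-bundle.crt https://site.example.com/service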

2. Shell Programming and Scripting

Proxy with curl/wget support

I need a proxy that would enable me to use cli curl/wget with another ip address. How do I find a paid proxy server that supports curl/wget? (1 Reply)
Discussion started by: locoroco
1 Replies
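A sketch of pointing curl at a paid (authenticated) proxy; host, port, and credentials are placeholders:

Code:
# -x names the proxy, -U passes the proxy credentials
curl -x http://proxy.example.com:3128 -U user:password "http://www.unix.com/"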

3. Shell Programming and Scripting

proxy server with wget or cli curl

I'm using a proxy service with an ip address and a port number. How do I use the proxy with wget or cli curl? (1 Reply)
Discussion started by: locoroco
1 Replies
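For wget, the same ip:port pair can be passed via the environment or per invocation with -e; the address here is a placeholder:

Code:
# environment variable: picked up automatically by wget (and by curl)
export http_proxy="http://203.0.113.5:8080"
wget -O page.html "http://www.unix.com/"

# or per invocation, using wgetrc commands on the command line
wget -e use_proxy=yes -e http_proxy=http://203.0.113.5:8080 -O page.html "http://www.unix.com/"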

4. Shell Programming and Scripting

How to download file without curl and wget

Hi, I need a shell script that will download a zip file every second from an http server, but I can't use either curl or wget. Can anyone help me go about this task? Thanks!! (1 Reply)
Discussion started by: rubber08
1 Replies
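One common fallback, sketched under the assumption that bash was built with /dev/tcp support and the server speaks plain HTTP (host and path are placeholders); note the saved response still contains the HTTP headers:

Code:
#!/bin/bash
# open a raw TCP connection to the web server on file descriptor 3
exec 3<>/dev/tcp/www.example.com/80
# send a minimal HTTP/1.0 request (1.0 avoids chunked transfer encoding)
printf 'GET /file.zip HTTP/1.0\r\nHost: www.example.com\r\n\r\n' >&3
# read the full response to disk; headers must be stripped before unzipping
cat <&3 > response.raw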

5. Shell Programming and Scripting

Encapsulating output of CURL and/or WGET

I use curl and wget quite often and set up alarms on their output. For instance, I would run a "wget" on a URL and then search for certain strings within the output given by the "wget". The problem is, I can't get the entire output or response of my wget/curl command to show up correctly in... (3 Replies)
Discussion started by: SkySmart
3 Replies
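A sketch of capturing everything a transfer produces so it can be searched for alarm strings (the URL is a placeholder):

Code:
# wget: -O - sends the page to stdout, 2>&1 folds wget's own messages into the same stream
response=$(wget -O - "http://www.unix.com/" 2>&1)

# curl: -s silences the progress meter, -w appends the HTTP status code at the end
response=$(curl -s -w '\nHTTP_CODE:%{http_code}' "http://www.unix.com/")
echo "$response" | grep -i "error string"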

6. Shell Programming and Scripting

Wget vs Curl - Proxy issue

Hi, my script needs to crawl data from a third-party site and is currently written with wget. The third-party site is a shared interface with different IP addresses. My wget works with all of the IP addresses but one, whereas curl is able to hit that IP address and comes out... (2 Replies)
Discussion started by: sathyaonnuix
2 Replies
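When the two tools disagree like this, comparing their verbose output against the failing address usually reveals the differing header or proxy behaviour; a sketch (the IP is a placeholder):

Code:
# wget's debug mode prints the exact request it sends
wget -d -O /dev/null "http://203.0.113.7/" 2>&1 | less

# curl's verbose mode shows the same for comparison
curl -v -o /dev/null "http://203.0.113.7/"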

7. Shell Programming and Scripting

Wget/curl credentials validation

Experts, I log in to a 3rd party site and pull some valuable information with my credentials, which I pass via --post-data in wget. Now my account is locked, and I want my wget to alert that the account is locked. How can I achieve this? My idea is to get the source page HTML from the... (2 Replies)
Discussion started by: sathyaonnuix
2 Replies
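A minimal sketch of the idea in that thread (the URL, form field names, and lock message are placeholders): save the reply and grep it for the lock text.

Code:
# post the credentials and keep the server's reply
wget -q --post-data='user=me&pass=secret' -O response.html "https://site.example.com/login"

# raise an alert if the reply contains the lock message
if grep -qi 'account.*locked' response.html; then
    echo "ALERT: account is locked" >&2
fi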

8. Shell Programming and Scripting

How to get content of a webpage Curl vs Wget?

Hello, what I am trying to do is get the HTML data of a website automatically. First I tried it manually, entering the command below in a terminal: $ wget http://www.***.*** -q -O code.html Unfortunately the code.html file was empty. When I entered the next command it gave Error 303/304: $... (1 Reply)
Discussion started by: baris35
1 Replies
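Editor's note: 3xx codes are redirects rather than errors, so following them is usually the fix; a sketch (the URL is a placeholder):

Code:
# curl does not follow redirects unless told to; -L chases them to the final page
curl -L -o code.html "http://www.example.com/"

# wget follows redirects by default; a browser-like User-Agent helps with picky servers
wget -q -O code.html --user-agent="Mozilla/5.0" "http://www.example.com/"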

9. Shell Programming and Scripting

Wget and curl to post data

I'm using these commands to post data to a remote host: wget --post-data="My Data" http://<my-ip>:80 -O /dev/null -q and curl --data "My Data" http://<my-ip>:80 However, when I run the above, I see the following in my access log on the remote host: Wget: 10.10.10.10 - - "POST /... (1 Reply)
Discussion started by: SkySmart
1 Replies
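The differing access-log entries come from each tool's default User-Agent; both can present the same one, as in this sketch (the IP and agent string are placeholders):

Code:
# wget: --user-agent overrides the default "Wget/<version>"
wget --post-data="My Data" --user-agent="my-script/1.0" -O /dev/null -q "http://203.0.113.9:80"

# curl: -A is the short form of --user-agent
curl --data "My Data" -A "my-script/1.0" -o /dev/null -s "http://203.0.113.9:80"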

10. Web Development

Wget/curl and javascript

What can I use instead of wget/curl when I need to log into websites that use javascript? Wget and curl don't handle javascript. (6 Replies)
Discussion started by: locoroco
6 Replies
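One option raised in threads like this is a headless browser, which executes the JavaScript and hands back the rendered DOM; a sketch (the binary may be chromium, chromium-browser, or google-chrome depending on the distribution):

Code:
# render the page with JavaScript executed, then dump the resulting DOM
chromium --headless --dump-dom "https://www.example.com/" > rendered.html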
HXINCL(1)                          HTML-XML-utils                          HXINCL(1)

NAME
       hxincl - expand included HTML or XML files

SYNOPSIS
       hxincl [ -x ] [ -f ] [ -s name=subst ] [ -s name=subst ]... [ -b base ]
       [ file-or-URL ]

DESCRIPTION
       The hxincl command copies an HTML or XML file to standard output, looking
       for comments with a certain structure. Such a comment is replaced by the
       file whose name is given as the attribute of the directive. For example:

           ...<!-- include "foo.html" -->...

       will be replaced by the content of the file foo.html. Note that file
       names must be quoted if they contain white space. The comment is replaced
       by <!-- begin-include "foo.html" --> before the included text and
       <!-- end-include "foo.html" --> after it. These comments make it possible
       to run hxincl on the resulting file again to update the inclusions.
       Single quotes are allowed instead of double quotes, and if the file name
       contains no spaces, the quotes may also be omitted.

OPTIONS
       The following options are supported:

       -x     Use XML conventions: empty elements are written with a slash at
              the end: <IMG />.

       -b base
              Sets the base URL for resolving relative URLs. By default the
              file given as argument is the base URL.

       -f     Removes the comments after including the files. This means hxincl
              cannot be run on the resulting file later to update the
              inclusions. (Mnemonic: final or frozen.)

       -s name=substitution
              Include a different file than the one mentioned in the directive.
              If the comment is <!-- include "name" -->, the file substitution
              is included instead. The option -s may occur multiple times.

OPERANDS
       The following operand is supported:

       file-or-URL
              The name of an HTML or XML file, or the URL of one. If absent,
              standard input is read instead.

EXIT STATUS
       The following exit values are returned:

       0      Successful completion.

       > 0    An error occurred in the parsing of one of the HTML or XML files.

ENVIRONMENT
       To use a proxy to retrieve remote files, set the environment variables
       http_proxy or ftp_proxy. E.g., http_proxy="http://localhost:8080/"

BUGS
       Assumes UTF-8 as input. Doesn't expand character entities. Instead pipe
       the input through hxunent(1) and asc2xml(1) to convert it to UTF-8.
       Remote files (specified with a URL) are currently only supported for
       HTTP. Password-protected files or files that depend on HTTP "cookies"
       are not handled. (You can use tools such as curl(1) or wget(1) to
       retrieve such files.)

SEE ALSO
       asc2xml(1), hxnormalize(1), hxnum(1), hxprune(1), hxtoc(1), hxunent(1),
       xml2asc(1), UTF-8 (RFC 2279)

6.x                                 10 Jul 2011                          HXINCL(1)
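A minimal usage sketch (file names are placeholders): expand the include comments in a template, then re-run later to refresh them.

Code:
# template.html contains: <!-- include "header.html" -->
hxincl template.html > page.html

# because begin-include/end-include markers are kept, the page can be refreshed in place
hxincl page.html > page.new && mv page.new page.html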