Hello,
I am new to Unix, but I want to know how we can fetch data from a web page (i.e. an HTML page). My requirement is to read an HTML page and create a flat file (text file) based on the contents of that page.
Thanks
Imtiaz (3 Replies)
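One common approach (a sketch, assuming either lynx or wget is installed; the URL and output filename below are placeholders):

```shell
# Fetch a web page and save a plain-text version of its contents.
url="http://www.example.com/page.html"   # placeholder URL

# Option 1: let a text browser render the page (if lynx is installed):
# lynx -dump "$url" > page.txt

# Option 2: fetch the raw HTML with wget and strip the tags with sed.
# This is crude (it ignores entities and scripts) but needs no extra tools.
wget -q -O - "$url" | sed -e 's/<[^>]*>//g' > page.txt
```

lynx -dump generally gives nicer output because it actually lays the page out; the sed pipeline only deletes anything between angle brackets.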
Hello,
I'm a total newbie to HTTP commands, so I'm not sure how to do this. What I'd like is to write a C program to fetch the contents of an HTML page at a given address.
Could someone help with this?
Thanks in advance! (4 Replies)
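Before writing the C code, it can help to see the protocol itself. The exchange a C program would perform (socket(), connect(), write a GET request, read the reply) can be sketched in bash using its /dev/tcp feature; the hostname is a placeholder:

```shell
#!/bin/bash
# Illustrates the raw HTTP/1.0 exchange a C program would implement:
# open a TCP connection to port 80, send a GET request, read the response.
# Bash's /dev/tcp stands in for the socket()/connect() calls.
host="www.example.com"                 # placeholder hostname
exec 3<>"/dev/tcp/$host/80"            # open fd 3 as a TCP socket
printf 'GET / HTTP/1.0\r\nHost: %s\r\n\r\n' "$host" >&3
cat <&3                                # response: status line, headers, HTML body
exec 3<&-                              # close the socket
```

A C program does the same thing with socket(), connect(), send() and recv(); the request and response bytes are identical.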
I have a shell script that runs periodic upgrades on machines. I want to print certain echo output from the shell script onto a web page. Which shell command should I use to do this? (1 Reply)
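The simplest approach is to redirect the echo output into an HTML file under the web server's document root; the path below is an assumed Apache DocumentRoot and is only a placeholder:

```shell
#!/bin/sh
# Write upgrade-status lines into an HTML page served by the web server.
# /var/www/html is an assumed DocumentRoot -- adjust for your installation.
log=/var/www/html/upgrade-status.html   # placeholder path
{
  echo "<html><body><pre>"
  echo "Upgrade started: $(date)"
  echo "Packages updated OK"
  echo "</pre></body></html>"
} > "$log"
```

Wrapping the lines in a pre block keeps the script's plain-text formatting when the page is viewed in a browser.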
wget --spider --user=xxxx --password=xxxx "http://xxx.xxxx.com" > /dev/null 2>&1;
I am using the above command in an if statement to check the response of the page without downloading it. I just want to check whether the page is up and running, but when I execute the command I get
HTTP... (2 Replies)
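Since the command's output is discarded by the redirection, the thing to test is wget's exit status: with --spider, wget exits 0 when the page answers and non-zero otherwise, so the command can sit directly in the if condition (credentials and URL kept as the placeholders from the post):

```shell
#!/bin/sh
# --spider makes wget check the URL without downloading the body.
# The if statement tests wget's exit status: 0 means the page responded.
if wget --spider --user=xxxx --password=xxxx "http://xxx.xxxx.com" > /dev/null 2>&1
then
    echo "page is up"
else
    echo "page is down (wget exit status $?)"
fi
```

Note that $? in the else branch still holds wget's exit status, because no other command has run yet in that branch.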
Hi,
I have a SCO Unix OpenServer V6 server which is hosting a website with Apache V1.3 as the HTTP server. The website has an initial login screen which redirects to another page once the user name and password have been verified.
When connecting to the website and trying to login, it times... (0 Replies)
hi
I was trying to run an HTML script but could not, as the Apache server was not loaded on my Linux server. How do I check whether Apache has been installed on my server or not? (1 Reply)
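A few ways to check, depending on the distribution (the binary names httpd/apache2 and the package-manager queries below are the usual conventions, but your system may differ):

```shell
#!/bin/sh
# Look for the Apache binary on the PATH (httpd on Red Hat-style systems,
# apache2 on Debian-style systems):
if command -v httpd >/dev/null 2>&1 || command -v apache2 >/dev/null 2>&1; then
    echo "Apache binary found in PATH"
fi

# Ask the package manager:
# rpm -qa | grep -i httpd        # Red Hat / CentOS
# dpkg -l | grep -i apache2      # Debian / Ubuntu

# Check whether it is actually running right now:
ps -ef | grep -i '[h]ttpd\|[a]pache2'
```

The bracketed first letter in the grep pattern is a common trick to stop grep from matching its own process in the ps listing.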
Hi everyone!
How can I get the time difference between the GET request and the HTTP/1.0 200 OK response (i.e. the latency of the web server) for each hostname from a pcap file, using tshark and shell or something else?
What can you recommend me to do that? (1 Reply)
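One possible sketch with tshark: the http.time field on each HTTP response packet reports the time elapsed since the matching request, and http.response_for.uri names that request (both field names assume a reasonably recent Wireshark build; capture.pcap is a placeholder filename). An awk stage can then average the latency per URI:

```shell
#!/bin/sh
# For every HTTP response in the capture, print the request URI and the
# time since the request, then average the times per URI with awk.
tshark -r capture.pcap -Y http.response \
       -T fields -e http.response_for.uri -e http.time |
awk '{ sum[$1] += $2; n[$1]++ }
     END { for (u in sum) printf "%s %.6f\n", u, sum[u]/n[u] }'
```

If your tshark is older and lacks http.response_for.uri, printing -e http.time alone still gives the per-response latency.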
Hi Guys,
Is there any way to know whether a website is fully loaded, from the Linux command line? Is there any command in Linux that can achieve that?
Also, I would naturally like to get the response code of the particular website/URL that I am testing.
Any help would be... (3 Replies)
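For the response code, curl's -w write-out option is the usual tool (the URL below is a placeholder):

```shell
#!/bin/sh
# -s silences the progress bar, -o /dev/null discards the body, and
# -w prints curl's %{http_code} variable after the transfer finishes.
url="http://www.example.com/"            # placeholder URL
code=$(curl -s -o /dev/null -w '%{http_code}' "$url")
echo "HTTP response code: $code"
if [ "$code" = 200 ]; then echo "URL is up"; fi
```

"Fully loaded" is harder to answer from the command line: curl fetches only the single document, not the images, stylesheets and scripts a browser would also request. curl's %{time_total} write-out variable at least reports how long that one transfer took.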
Discussion started by: Pradeep_1990
YAZ-URL(1)                        Commands                        YAZ-URL(1)

NAME
yaz-url - YAZ URL fetch utility
SYNOPSIS
yaz-url [-H name:value] [-m method] [-O fname] [-p fname] [-u user/password] [-x proxy] [url...]
DESCRIPTION
yaz-url is a utility to fetch web content. It is very limited in functionality compared to programs such as curl and wget.
The options must precede the URL given on the command line to take effect.
Fetched HTTP content is written to stdout, unless option -O is given.
OPTIONS
-H name:value
Specifies HTTP header content with name and value. This option can be given multiple times (for different names, of course).
-m method
Specifies the HTTP method to be used for the next URL. Default is method "GET". However, option -p sets it to "POST".
-O fname
Sets output filename for HTTP content.
-p fname
Sets a file to be POSTed in the following URL.
-u user/password
Specifies a user and a password to be used in HTTP basic authentication in the following URL fetch. The user and password must be
separated by a slash (thus it is not possible to specify a user name containing a slash).
-x proxy
Specifies a proxy to be used for URL fetch.
SEE ALSO
yaz(7)

YAZ 4.2.30                        04/16/2012                        YAZ-URL(1)