12-18-2006
How to get the page size (of a url) using wget
Hi,
I am trying to get the page size of a URL (e.g.,
www.example.com) using the wget command. Any thoughts on what parameters I need to pass to wget to get just the size?
Regards,
Raj
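One common approach (assuming the server reports a Content-Length header) is to request only the headers with `wget --spider --server-response` and pull out the Content-Length value. A minimal sketch; the helper name and the example URL are illustrative:

```shell
# Extract the Content-Length value (bytes) from wget --server-response output.
extract_length() {
    awk 'tolower($1) == "content-length:" { print $2; exit }'
}

# Real usage (needs network); www.example.com stands in for your URL:
#   wget --spider --server-response http://www.example.com/ 2>&1 | extract_length

# Offline demonstration on a captured header block:
printf '  HTTP/1.1 200 OK\n  Content-Length: 1270\n' | extract_length
```

Note that servers using chunked transfer encoding may not send Content-Length at all; in that case the fallback is to download the page and measure it, e.g. with `wc -c`.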
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
Hi all,
I wrote a script which starts a WebLogic server and waits until it is loaded before deploying several apps. The way I checked was something like:
rc=1
while [ $rc -ne 0 ]; do
wget --spider <URL>:<port>/console > /dev/null 2>&1
rc=$?
done
This works perfectly because it's an HTML site, and when the server is... (2 Replies)
Discussion started by: AlbertGM
2. UNIX for Dummies Questions & Answers
So, I'd like to wget a webpage, as it's not going to stick around forever - but the problem is the webpage has a semicolon in it.
wget http://example.com/stuff/asdf;asdf obviously doesn't get the right webpage.
Any good way around this? (2 Replies)
Discussion started by: Julolidine
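The usual fix is to quote the URL so the shell stops treating `;` as a command separator. A quick sketch using the URL from the post:

```shell
# Unquoted, the shell splits 'wget http://example.com/stuff/asdf;asdf' into two
# commands at the semicolon. Single quotes keep the URL whole:
url='http://example.com/stuff/asdf;asdf'
echo "$url"

# Real fetch (needs network):
#   wget "$url"
```

Percent-encoding the semicolon as `%3B` is an alternative, though servers may interpret an encoded semicolon differently from a literal one in the path.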
3. Shell Programming and Scripting
Hello,
I am experiencing an issue while downloading a few pages using wget. All of them work without a problem except one, which is a page that does a tail on a log and as a result is constantly being updated.
wget here seems to run endlessly and needs to be killed manually. I wanted to... (0 Replies)
Discussion started by: prafulnama
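One way to handle a page that streams forever (assuming GNU coreutils is available) is to bound wget's runtime with `timeout`, which kills the command when the limit passes and exits with status 124. Sketched here with `sleep` standing in for the never-ending download:

```shell
# For the real page:  timeout 30 wget -q -O snapshot.html "$url"
# Here sleep plays the role of a download that never finishes:
timeout 1 sleep 10
echo "exit status: $?"
```

Checking for exit status 124 lets the script distinguish "stopped at the time limit, partial page saved" from a genuine wget failure.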
4. Shell Programming and Scripting
For example, I have an HTML file containing
<a href="http://awebsite" id="awebsite" class="first">website</a>
and sometimes a line contains more than one link, for example
<a href="http://awebsite" id="awebsite" class="first">website</a><a href="http://bwebsite" id="bwebsite"... (36 Replies)
Discussion started by: 14th
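For pulling every link out of a line like the above, `grep -o` emits each match on its own line, which handles multiple `<a>` tags per line. A minimal sketch for simple, regular markup like the sample:

```shell
# Sample line with two anchors, as in the post above:
html='<a href="http://awebsite" id="awebsite" class="first">website</a><a href="http://bwebsite" id="bwebsite" class="first">bwebsite</a>'

# -o prints each href="..." match separately; sed strips the wrapper.
echo "$html" | grep -o 'href="[^"]*"' | sed 's/^href="//; s/"$//'
```

For arbitrary real-world HTML a proper parser is safer than regular expressions, but this works when the markup is as uniform as in the example.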
5. UNIX for Dummies Questions & Answers
Hi All,
I want to launch (e.g., http://gmail.com) from the cmd window and validate the credentials with a username and password; is it possible?
I have found something like this:
"wget --http-user='USER' --http-password='PASSWORD' http://gmail.com". I am new to this and unable to find a solution, i... (0 Replies)
Discussion started by: harsha85
6. Shell Programming and Scripting
Good evening to all!!
I'm trying to become familiar with wget.
I would like to download a page from Wikipedia with all images and CSS, but without following all the links present in the page. It should be named index.html.
I would also like to save it to /mnt/us inside a new folder.
This is... (5 Replies)
Discussion started by: silver18
7. UNIX for Dummies Questions & Answers
Hi Experts,
Problem statement :
We have a URL from which we need to read the data and parse it inside our shell scripts.
My AIX has a very limited Perl installation, and I can't install any utilities either.
Precisely: wget, cURL, Lynx, w3m and LWP can't be used, as I only found these utilities when I googled... (0 Replies)
Discussion started by: scott_cog
8. Shell Programming and Scripting
Hi Experts,
Problem statement :
We have a URL from which we need to read the data and parse it inside our shell scripts. My AIX has a very limited Perl installation, and I can't install any utilities either.
Precisely: wget, cURL, Lynx, w3m and LWP can't be used, as I only found these utilities when I googled it.... (12 Replies)
Discussion started by: scott_cog
9. Post Here to Contact Site Administrators and Moderators
Hi
I just tried to post the following link while answering; it's not parsing properly, just try it in your browser.
Tried to paste while answering:
https://www.unix.com/302873559-post2.html
The Not operator is not coming through with HTML/PHP tags, so I am attaching a file. (2 Replies)
Discussion started by: Akshay Hegde
10. Shell Programming and Scripting
Wget Error Codes:
0 No problems occurred.
1 Generic error code.
2 Parse error—for instance, when parsing command-line options, the .wgetrc or .netrc…
3 File I/O error.
4 Network failure.
5 SSL verification failure.
6 Username/password authentication failure.
... (3 Replies)
Discussion started by: mohtashims
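The list above maps directly to a `case` statement on `$?`. A sketch; codes 7 and 8 (protocol error, server-issued error response) come from the wget manual and continue the truncated list:

```shell
# Translate a wget exit status into the documented meaning.
explain_wget_status() {
    case "$1" in
        0) echo "no problems occurred" ;;
        1) echo "generic error" ;;
        2) echo "parse error (command line, .wgetrc, or .netrc)" ;;
        3) echo "file I/O error" ;;
        4) echo "network failure" ;;
        5) echo "SSL verification failure" ;;
        6) echo "username/password authentication failure" ;;
        7) echo "protocol error" ;;
        8) echo "server issued an error response" ;;
        *) echo "unknown status $1" ;;
    esac
}

# Typical use (needs network):  wget -q "$url"; explain_wget_status $?
explain_wget_status 4
```

Note that with multiple URLs wget returns the most recent non-zero status, so a single code may not describe every file in the run.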
LEARN ABOUT DEBIAN
yaz-url
YAZ-URL(1) Commands YAZ-URL(1)
NAME
yaz-url - YAZ URL fetch utility
SYNOPSIS
yaz-url [-H name:value] [-m method] [-O fname] [-p fname] [-u user/password] [-x proxy] [url...]
DESCRIPTION
yaz-url is a utility to fetch web content. It is very limited in functionality compared to programs such as curl and wget.
The options must precede the URL given on the command line to take effect.
Fetched HTTP content is written to stdout, unless option -O is given.
OPTIONS
-H name:value
Specifies HTTP header content with name and value. This option can be given multiple times (for different names, of course).
-m method
Specifies the HTTP method to be used for the next URL. Default is method "GET". However, option -p sets it to "POST".
-O fname
Sets output filename for HTTP content.
-p fname
Sets a file to be POSTed to the following URL.
-u user/password
Specifies a user and a password to be used in HTTP basic authentication in the following URL fetch. The user and password must be
separated by a slash (thus it is not possible to specify a user with a slash in it).
-x proxy
Specifies a proxy to be used for URL fetch.
SEE ALSO
yaz(7)
YAZ 4.2.30 04/16/2012 YAZ-URL(1)