How to get the page size (of a URL) using wget
# 1  12-18-2006
Hi,
I am trying to get the page size of a URL (e.g., www.example.com) using the wget command. Any thoughts on what parameters I need to pass to wget to get the size alone?

Regards,
Raj
# 2  12-18-2006
wget --spider --server-response [url] will print the headers the server returns without downloading the page itself. Unfortunately, many pages, particularly dynamically generated ones, won't report their size; wget just shows "Length: unspecified [text/html]".
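
A minimal sketch of pulling the size out of those headers, assuming GNU wget (which writes the server response to stderr) and a server that actually sends a Content-Length header:

    # Fetch headers only; --spider skips downloading the body.
    # The header dump goes to stderr, so merge it into stdout first.
    # (Sketch: www.example.com and the awk filter are illustrative.)
    wget --spider --server-response http://www.example.com 2>&1 \
        | awk 'tolower($1) == "content-length:" { print $2 }'

If the server omits Content-Length (common with dynamic pages, as noted above), this prints nothing; the only reliable fallback is to download the page and count the bytes yourself, e.g. wget -qO- http://www.example.com | wc -c.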