Full Discussion: curl command with web pages
Post 302322711 by spirtle in UNIX for Dummies Questions & Answers, Thursday 4th of June 2009, 11:31:32 AM
By default, curl writes to standard output, so to get it to write to a file you can either use the usual Unix file redirection:
Code:
curl url > file

or use the -o or --output option:
Code:
curl url -o file

The HTML file at the particular URL you have there has only a few lines (a 301 redirect notice) anyway, so are you sure you got just the first few lines and not all of them?
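Incidentally, since that page is just a 301 redirect, you can add -L to make curl follow the redirect and save the final page instead. A minimal sketch, with a placeholder URL:
Code:
# -L follows redirects, -s hides the progress meter,
# -o writes the final page's body to page.html
curl -sL -o page.html http://example.com/some/page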
 

CURLOPT_CONNECTTIMEOUT_MS(3)				     curl_easy_setopt options				      CURLOPT_CONNECTTIMEOUT_MS(3)

NAME
       CURLOPT_CONNECTTIMEOUT_MS - timeout for the connect phase

SYNOPSIS
       #include <curl/curl.h>

       CURLcode curl_easy_setopt(CURL *handle, CURLOPT_CONNECTTIMEOUT_MS, long timeout);

DESCRIPTION
       Pass a long. It should contain the maximum time in milliseconds that
       you allow the connection phase to the server to take. This only
       limits the connection phase; it has no impact once it has connected.
       Set to zero to switch to the default built-in connection timeout -
       300 seconds. See also the CURLOPT_TIMEOUT_MS(3) option.

       In unix-like systems, this might cause signals to be used unless
       CURLOPT_NOSIGNAL(3) is set.

DEFAULT
       300000

PROTOCOLS
       All

EXAMPLE
       CURL *curl = curl_easy_init();
       if(curl) {
         curl_easy_setopt(curl, CURLOPT_URL, "http://example.com");

         /* complete connection within 10000 milliseconds */
         curl_easy_setopt(curl, CURLOPT_CONNECTTIMEOUT_MS, 10000L);

         curl_easy_perform(curl);

         /* free the handle when done */
         curl_easy_cleanup(curl);
       }

AVAILABILITY
       Always

RETURN VALUE
       Returns CURLE_OK

SEE ALSO
       CURLOPT_TIMEOUT(3), CURLOPT_LOW_SPEED_LIMIT(3)

libcurl 7.54.0                 February 14, 2016      CURLOPT_CONNECTTIMEOUT_MS(3)
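For comparison with the command-line usage above, the curl tool's equivalent of this libcurl option is the --connect-timeout flag, which takes seconds rather than milliseconds. A minimal sketch, with a placeholder URL:
Code:
# give up if the connection is not established within 10 seconds;
# --max-time would additionally cap the total transfer time
curl --connect-timeout 10 -o page.html http://example.com/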