01-29-2011
Thank you. I'm surprised I missed that. I may as well ask one more question -
In terms of grabbing webpages -
is there a faster way to grab a page, or is this mostly limited by the speed of the internet connection? I was playing around with wget, lynx --dump and curl, and the fastest any of them manages is about 0.5 seconds; curl is the slowest.
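Much of that half second usually goes to DNS lookup and connection setup rather than the transfer itself, so it helps to measure where the time is spent before hunting for a faster tool. A rough sketch using curl's built-in timing variables; the URL and the urls.txt file are placeholders, and fetching several pages in parallel is one way to hide per-request latency:

# break down where a single fetch spends its time (placeholder URL)
curl -s -o /dev/null \
     -w 'dns=%{time_namelookup}s connect=%{time_connect}s ttfb=%{time_starttransfer}s total=%{time_total}s\n' \
     http://example.com/

# for many pages, overlap the waits: fetch up to 4 URLs at a time
# (urls.txt is assumed to hold one URL per line)
xargs -n 1 -P 4 curl -s -O < urls.txt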
10 More Discussions You Might Find Interesting
1. Shell Programming and Scripting
We are trying to invoke an HTTPS service from our UNIX script using the curl command. The service is not getting invoked because it is SSL configured. Bypassing certificate verification (using curl -k) does not work.
curl -k https://site
curl -k -x IP:Port https://site
curl -k -x IP:443 https://id:pwd@site
... (0 Replies)
Discussion started by: dineshbabu01
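For this kind of failure it usually helps to separate the proxy settings from the certificate handling. A minimal sketch only, with the proxy host, port, credentials and CA bundle path as placeholders:

# verify against an explicit CA bundle and pass credentials with -u instead of in the URL
curl -v -x http://proxyhost:8080 --cacert /path/to/ca-bundle.pem -u myid:mypwd https://site
# or skip verification entirely while testing
curl -v -k -x http://proxyhost:8080 https://site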
2. Shell Programming and Scripting
I need a proxy that would enable me to use CLI curl/wget with another IP address.
How do I find a paid proxy server that supports curl/wget? (1 Reply)
Discussion started by: locoroco
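Once a proxy is available, both tools can be pointed at it from the command line. A sketch only, with the proxy host, port and credentials as placeholders:

# explicit proxy on the curl command line
curl -x http://user:pass@proxyhost:3128 http://example.com/
# wget honours the standard proxy environment variables
https_proxy=http://user:pass@proxyhost:3128 wget -O page.html https://example.com/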
3. Shell Programming and Scripting
Hello,
I am wondering whether anyone knows of a method, using curl/wget or another tool, by which I could specify the IP address of the server I wish to query for a website.
Something similar to editing /etc/hosts, but that can be done directly from the command line. I have looked through the man pages... (4 Replies)
Discussion started by: colinireland
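curl's --resolve option does essentially this without touching /etc/hosts. A small sketch, with the host name and address as placeholders:

# send the request for www.example.com to 192.0.2.10, keeping the original Host header
curl --resolve www.example.com:80:192.0.2.10 http://www.example.com/
# wget has no direct equivalent, but requesting the IP with an explicit Host header is close
wget --header='Host: www.example.com' -O page.html http://192.0.2.10/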
4. Shell Programming and Scripting
Hi
I need a shell script that will download a zip file every second from an HTTP server, but I can't use either curl or wget.
Can anyone help me go about this task?
Thanks!! (1 Reply)
Discussion started by: rubber08
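Without curl or wget the fetch has to come from somewhere else; one possibility, if bash was built with network redirection support, is the /dev/tcp pseudo-device. A rough sketch only, with the server, path and output name as placeholders and very simplistic header stripping:

#!/bin/bash
# fetch a zip roughly once a second without curl/wget, using bash's /dev/tcp
while true; do
    exec 3<>/dev/tcp/server.example.com/80
    printf 'GET /files/data.zip HTTP/1.0\r\nHost: server.example.com\r\n\r\n' >&3
    # drop the HTTP response headers (everything up to the first blank line), keep the body
    sed '1,/^[[:space:]]*$/d' <&3 > "data_$(date +%s).zip"
    exec 3<&- 3>&-
    sleep 1
done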
5. Shell Programming and Scripting
I use curl and wget quite often.
I set up alarms on their output. For instance, I would run a "wget" on a URL and then search for certain strings within the output given by the "wget".
The problem is, I can't get the entire output or response of my wget/curl command to show up correctly in... (3 Replies)
Discussion started by: SkySmart
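wget and curl send their progress and error messages to stderr, so an alarm that only reads stdout misses part of the response. A small sketch that captures both streams before searching; the URL and the search string are placeholders:

#!/bin/sh
url=http://example.com/status            # placeholder URL
# -O- sends the page to stdout; 2>&1 folds wget's own messages into the capture
output=$(wget -q -O- "$url" 2>&1)
# the same idea with curl: -sS hides the progress meter but keeps error messages
# output=$(curl -sS "$url" 2>&1)
if printf '%s\n' "$output" | grep -q 'ERROR STRING'; then
    echo "alarm: pattern found for $url" >&2
fi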
6. Shell Programming and Scripting
Hi,
My script needs to crawl data from a third-party site. Currently it is written around wget. The third-party site is a shared interface with different IP addresses.
My wget works with all the IP addresses but one, whereas curl is able to hit that IP address and comes out... (2 Replies)
Discussion started by: sathyaonnuix
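When one client gets through and the other does not, comparing the exact requests each tool sends usually shows the difference (Host header, redirect handling, user agent). A quick diagnostic sketch, with the address as a placeholder:

# show the full exchange from each tool and compare the request headers
wget -d -O /dev/null http://192.0.2.10/ 2>&1 | head -40
curl -sv -o /dev/null http://192.0.2.10/ 2>&1 | head -40
# a rejected User-Agent is a common cause; forcing one is a quick test
wget -U 'Mozilla/5.0' -O /dev/null http://192.0.2.10/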
7. Shell Programming and Scripting
Experts,
I log in to a third-party site and pull some valuable information with my credentials. I pass my credentials via --post-data in wget.
Now my account is locked. I want my wget to alert me that the account is locked. How can I achieve this?
My idea is to get the source-page HTML from the... (2 Replies)
Discussion started by: sathyaonnuix
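The usual approach is exactly that: keep the page returned by the --post-data login and search it for the lock message. A sketch with the URL, form field names and the exact wording of the message as placeholders:

#!/bin/sh
page=$(wget -q -O- --post-data='user=myid&pass=mypwd' 'https://thirdparty.example/login')
if printf '%s\n' "$page" | grep -qi 'account is locked'; then
    echo "ALERT: account appears to be locked" | mailx -s 'login alert' me@example.com
fi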
8. UNIX for Dummies Questions & Answers
Hi Experts,
Problem statement:
We have a URL whose data we need to read and parse inside shell scripts.
My AIX has a very limited Perl installation, and I can't install any utilities either.
To be precise, wget, cURL, Lynx, w3m and LWP can't be used, as I only found these utilities when I googled... (0 Replies)
Discussion started by: scott_cog
9. Shell Programming and Scripting
Hello,
What I am trying to do is get the HTML data of a website automatically.
First I decided to do it manually, and in a terminal I entered the code below:
$ wget http://www.***.*** -q -O code.html
Unfortunately the code.html file was empty.
When I entered the code below, it gave Error 303-304
$... (1 Reply)
Discussion started by: baris35
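303 and 304 are redirect / not-modified responses, so an empty file usually means the content sits behind a redirect that was not followed. Showing the server headers and following redirects is a reasonable first test; the masked URL is replaced with a placeholder here:

# print the server response headers so the 3xx status and Location line are visible
wget -S --max-redirect=10 -O code.html http://www.example.com/
# curl needs -L to follow redirects
curl -sL -o code.html http://www.example.com/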
10. Web Development
What can I use instead of wget/curl when I need to log into websites that use JavaScript?
Wget and curl don't handle JavaScript. (6 Replies)
Discussion started by: locoroco
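The usual answer is a headless browser driven from the shell rather than wget/curl. One option, assuming a reasonably recent Chrome or Chromium build is installed (the binary name varies by system, and the URL is a placeholder):

# render the page, run its JavaScript, and dump the resulting DOM to stdout
google-chrome --headless --dump-dom 'https://example.com/app' > page.html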
LEARN ABOUT MOJAVE
curlopt_timeout
CURLOPT_TIMEOUT(3) curl_easy_setopt options CURLOPT_TIMEOUT(3)
NAME
CURLOPT_TIMEOUT - set maximum time the request is allowed to take
SYNOPSIS
#include <curl/curl.h>
CURLcode curl_easy_setopt(CURL *handle, CURLOPT_TIMEOUT, long timeout);
DESCRIPTION
Pass a long as parameter containing timeout - the maximum time in seconds that you allow the libcurl transfer operation to take. Normally,
name lookups can take a considerable time, and limiting operations to less than a few minutes risks aborting perfectly normal operations.
This option may cause libcurl to use the SIGALRM signal to time out system calls.
In unix-like systems, this might cause signals to be used unless CURLOPT_NOSIGNAL(3) is set.
If both CURLOPT_TIMEOUT(3) and CURLOPT_TIMEOUT_MS(3) are set, the value set last will be used.
Since this puts a hard limit on how long a request is allowed to take, it has limited use in dynamic use cases with varying transfer
times. You are then advised to explore CURLOPT_LOW_SPEED_LIMIT(3), CURLOPT_LOW_SPEED_TIME(3) or using CURLOPT_PROGRESSFUNCTION(3) to
implement your own timeout logic.
DEFAULT
Default timeout is 0 (zero) which means it never times out during transfer.
PROTOCOLS
All
EXAMPLE
CURL *curl = curl_easy_init();
if(curl) {
  curl_easy_setopt(curl, CURLOPT_URL, "http://example.com");
  /* complete within 20 seconds */
  curl_easy_setopt(curl, CURLOPT_TIMEOUT, 20L);
  curl_easy_perform(curl);
}
AVAILABILITY
Always
RETURN VALUE
Returns CURLE_OK
SEE ALSO
CURLOPT_TIMEOUT_MS(3), CURLOPT_CONNECTTIMEOUT(3), CURLOPT_LOW_SPEED_LIMIT(3),
libcurl 7.54.0 February 03, 2016 CURLOPT_TIMEOUT(3)
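For comparison with the command-line discussion above, the CLI counterpart of CURLOPT_TIMEOUT is curl's --max-time (-m) option, while --connect-timeout limits only the connection phase:

# give the whole transfer at most 20 seconds, and the connection phase at most 5
curl --max-time 20 --connect-timeout 5 -o page.html http://example.com/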