
CURLINFO_SSL_ENGINES(3) 				     curl_easy_getinfo options					   CURLINFO_SSL_ENGINES(3)

NAME
       CURLINFO_SSL_ENGINES - get an slist of OpenSSL crypto-engines

SYNOPSIS
       #include <curl/curl.h>

       CURLcode curl_easy_getinfo(CURL *handle, CURLINFO_SSL_ENGINES,
                                  struct curl_slist **engine_list);

DESCRIPTION
       Pass the address of a 'struct curl_slist *' to receive a linked-list
       of OpenSSL crypto-engines supported. Note that engines are normally
       implemented in separate dynamic libraries. Hence not all the returned
       engines may be available at run-time. NOTE: you must call
       curl_slist_free_all(3) on the list pointer once you're done with it,
       as libcurl will not free the data for you.

PROTOCOLS
       All TLS based ones.

EXAMPLE
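       A minimal usage sketch (not an official curl example); it assumes
       libcurl was built against OpenSSL with engine support, and it frees
       the returned list with curl_slist_free_all(3) once done:

       #include <stdio.h>
       #include <curl/curl.h>

       int main(void)
       {
         CURL *curl = curl_easy_init();
         if(curl) {
           struct curl_slist *engines = NULL;
           CURLcode res = curl_easy_getinfo(curl, CURLINFO_SSL_ENGINES,
                                            &engines);
           if(res == CURLE_OK && engines) {
             struct curl_slist *e;
             /* walk the linked list of engine names */
             for(e = engines; e; e = e->next)
               printf("engine: %s\n", e->data);
             /* the caller owns the list and must free it */
             curl_slist_free_all(engines);
           }
           curl_easy_cleanup(curl);
         }
         return 0;
       }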
AVAILABILITY
       Added in 7.12.3. Available in OpenSSL builds with "engine" support.

RETURN VALUE
       Returns CURLE_OK if the option is supported, and CURLE_UNKNOWN_OPTION
       if not.

SEE ALSO
       curl_easy_getinfo(3), curl_easy_setopt(3)

libcurl 7.54.0                    February 03, 2016          CURLINFO_SSL_ENGINES(3)


CURLINFO_EFFECTIVE_URL(3)				     curl_easy_getinfo options					 CURLINFO_EFFECTIVE_URL(3)

NAME
       CURLINFO_EFFECTIVE_URL - get the last used URL

SYNOPSIS
       #include <curl/curl.h>

       CURLcode curl_easy_getinfo(CURL *handle, CURLINFO_EFFECTIVE_URL,
                                  char **urlp);

DESCRIPTION
       Pass in a pointer to a char pointer and get the last used effective
       URL. In cases when you've asked libcurl to follow redirects, it may
       very well not be the same value you set with CURLOPT_URL(3). The urlp
       pointer will be NULL or pointing to private memory you MUST NOT free -
       it gets freed when you call curl_easy_cleanup(3) on the corresponding
       CURL handle.

PROTOCOLS
       HTTP(S)

EXAMPLE
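       A minimal usage sketch (not an official curl example);
       https://example.com is a placeholder URL, and CURLOPT_FOLLOWLOCATION(3)
       is enabled so a redirect can make the effective URL differ from the
       one set with CURLOPT_URL(3):

       #include <stdio.h>
       #include <curl/curl.h>

       int main(void)
       {
         CURL *curl = curl_easy_init();
         if(curl) {
           /* placeholder URL; any redirecting URL shows the difference */
           curl_easy_setopt(curl, CURLOPT_URL, "https://example.com");
           curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
           if(curl_easy_perform(curl) == CURLE_OK) {
             char *url = NULL;
             curl_easy_getinfo(curl, CURLINFO_EFFECTIVE_URL, &url);
             if(url)
               printf("Effective URL: %s\n", url);
             /* do NOT free url - libcurl owns that memory and releases
                it in curl_easy_cleanup(3) */
           }
           curl_easy_cleanup(curl);
         }
         return 0;
       }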
AVAILABILITY
       Added in 7.4.

RETURN VALUE
       Returns CURLE_OK if the option is supported, and CURLE_UNKNOWN_OPTION
       if not.

SEE ALSO
       curl_easy_getinfo(3), curl_easy_setopt(3)

libcurl 7.54.0                    February 03, 2016        CURLINFO_EFFECTIVE_URL(3)