Learning scrapers, webcrawlers, search engines and CURL
Post 303019102 by Neo, 06-23-2018
Quote:
Originally Posted by TBotNik
  • Text only vs regular browser: which is best?
  • wget vs php fileopen vs CURL: Which is best?
  • HTML tag find/parse: Are there libraries that effectively do this?
  • HTML tag find/parse: Is REGEX the best way to parse these? Where are examples?
  • Checking for the new meta-tags of:
I think you are better off getting the web page content using PHP scripts and parsing the files with regex.

If you Google around, I am sure you can find many sample PHP scripts that do most of what you want. This is very old technology and there is no need to reinvent the wheel parsing HTML data.
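To make that concrete, here is a rough sketch of the approach (untested; the URL, user-agent string and the meta-tag pattern are just placeholders). It fetches a page with PHP's cURL functions and pulls <meta> name/content pairs out with preg_match_all():

<?php
// Fetch the page with PHP's cURL extension (URL is a placeholder).
$ch = curl_init('https://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);        // return the body as a string
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);        // follow redirects
curl_setopt($ch, CURLOPT_USERAGENT, 'MyCrawler/0.1');  // identify your crawler
$html = curl_exec($ch);
curl_close($ch);

if ($html === false) {
    die("Fetch failed\n");
}

// Pull out <meta name="..." content="..."> pairs with a regex
// (handles the common attribute order only; fine for a sketch).
preg_match_all(
    '/<meta\s+name=["\']([^"\']+)["\']\s+content=["\']([^"\']*)["\']/i',
    $html,
    $matches,
    PREG_SET_ORDER
);

foreach ($matches as $m) {
    echo $m[1] . ' => ' . $m[2] . "\n";
}
?>

On the wget vs fopen vs cURL question: file_get_contents() is fine for a simple GET, but the cURL functions give you finer control over redirects, timeouts, headers and the user-agent string, which matters once you are crawling many sites. And for anything beyond simple tag grabs, PHP's DOMDocument parser is more forgiving of messy HTML than a regex.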
 

CURLINFO_SSL_ENGINES(3) 				     curl_easy_getinfo options					   CURLINFO_SSL_ENGINES(3)

NAME
       CURLINFO_SSL_ENGINES - get an slist of OpenSSL crypto-engines

SYNOPSIS
       #include <curl/curl.h>

       CURLcode curl_easy_getinfo(CURL *handle, CURLINFO_SSL_ENGINES,
                                  struct curl_slist **engine_list);

DESCRIPTION
       Pass the address of a 'struct curl_slist *' to receive a linked-list of
       OpenSSL crypto-engines supported. Note that engines are normally
       implemented in separate dynamic libraries. Hence not all the returned
       engines may be available at run-time. NOTE: you must call
       curl_slist_free_all(3) on the list pointer once you're done with it, as
       libcurl will not free the data for you.

PROTOCOLS
       All TLS based ones.

EXAMPLE
       TODO

AVAILABILITY
       Added in 7.12.3. Available in OpenSSL builds with "engine" support.

RETURN VALUE
       Returns CURLE_OK if the option is supported, and CURLE_UNKNOWN_OPTION
       if not.

SEE ALSO
       curl_easy_getinfo(3), curl_easy_setopt(3)

libcurl 7.54.0                 February 03, 2016           CURLINFO_SSL_ENGINES(3)