WWW::RobotRules::AnyDBM_File(3) 			User Contributed Perl Documentation			   WWW::RobotRules::AnyDBM_File(3)

NAME
WWW::RobotRules::AnyDBM_File - Persistent RobotRules
SYNOPSIS
       require WWW::RobotRules::AnyDBM_File;
       require LWP::RobotUA;

       # Create a robot useragent that uses a disk-caching RobotRules
       my $rules = WWW::RobotRules::AnyDBM_File->new( 'my-robot/1.0', 'cachefile' );
       my $ua = LWP::RobotUA->new( 'my-robot/1.0', 'me@foo.com', $rules );

       # Then just use $ua as usual
       $res = $ua->request($req);
DESCRIPTION
This is a subclass of WWW::RobotRules that uses the AnyDBM_File package to implement persistent disk caching of robots.txt and host visit information. The constructor (the new() method) takes an extra argument specifying the name of the DBM file to use. If the DBM file already exists, then you can specify undef as the agent name, since the name can be obtained from the DBM database.
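The reopening behavior described above might be sketched as follows (the file name 'cachefile' and the rules fed to parse() are illustrative; the module must be installed from CPAN):

```perl
use strict;
use warnings;
require WWW::RobotRules::AnyDBM_File;

# First run: create the cache, binding it to our agent name.
my $rules = WWW::RobotRules::AnyDBM_File->new( 'my-robot/1.0', 'cachefile' );
$rules->parse( 'http://example.com/robots.txt',
               "User-agent: *\nDisallow: /private\n" );
undef $rules;   # flush and close the DBM file

# Later run: the agent name is stored in the DBM database,
# so undef may be passed in its place.
my $cached = WWW::RobotRules::AnyDBM_File->new( undef, 'cachefile' );
print $cached->allowed('http://example.com/index.html')
    ? "allowed\n" : "denied\n";
```

Because the rules persist on disk between processes, a robot restarted later need not refetch robots.txt for hosts it has already visited.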
SEE ALSO
WWW::RobotRules, LWP::RobotUA
AUTHORS
Hakan Ardo <hakan@munin.ub2.lu.se>, Gisle Aas <aas@sn.no>

perl v5.18.2							2012-02-15				   WWW::RobotRules::AnyDBM_File(3)
Related Man Pages
www::robotrules(3) - mojave
www::robotrules(3) - suse
lwp::robotua(3) - suse
lwp::robotua(3pm) - debian
www::robotrules(3pm) - debian