robot-pmaptest(1) General Commands Manual robot-pmaptest(1)
NAME
robot-pmaptest - occupancy grid map creation tool
SYNOPSIS
robot-pmaptest [options] <logfilename>
DESCRIPTION
robot-pmaptest is a utility that demonstrates the basic functionality of the pmap library and serves as a handy mapping utility in its own right.
Given a Player logfile containing odometry and laser data, robot-pmaptest will produce an occupancy grid map of the environment.
OPTIONS
-g
disable the GUI (run in console mode only).
--range_max range
maximum effective range for the laser in meters (default: range saved in logfile).
--position_index index
index of odometry device in logfile (default: 0).
--laser_index index
index of laser device in logfile (default: 0).
--num_samples number
number of samples in particle filter (default: 200).
--resample_interval number
number of scans between resampling steps.
--resample_sigma width
width of the resampling Gaussian.
--num_cycles number
number of optimization cycles in the fine phase (default: 100).
--robot_x position
initial position of the robot on the x-axis.
--robot_y position
initial position of the robot on the y-axis.
--robot_rot rotation
initial rotation of the robot in degrees.
--grid_width width
width of the grid in meters (default: 64.0).
--grid_height height
height of the grid in meters (default: 48.0).
--grid_scale scale
scale of the grid in meters per cell (default: 0.10).
--laser_x position
position of the laser scanner on the robot.
--laser_rot rotation
rotation of the laser scanner on the robot in degrees.
--robot_hostname hostname
the hostname of the robot to verify in the logfile.
--skip time
amount of time to skip between log entries.
--range_res resolution
resolution of the laser (only used in lodo, not lodo2, which is currently used).
--action_model_xx factor
belief factor for the change in the robot's pose.
--action_model_rx factor
belief factor for the change in the robot's pose.
--action_model_rr factor
belief factor for the change in the robot's pose.
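EXAMPLES
A typical invocation, as a sketch (the logfile name mylog.log is illustrative): build a map from a Player logfile using a 400-sample particle filter and an 8 m maximum laser range:
robot-pmaptest --num_samples 400 --range_max 8.0 mylog.log
The same run with the GUI disabled (console mode only):
robot-pmaptest -g --num_samples 400 --range_max 8.0 mylog.log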
AUTHOR
Player was written by Brian Gerkey <gerkey@users.sourceforge.net> and contributors. This manual page was written by Daniel Hess for the
Debian Project.
SEE ALSO
The HTML documentation in /usr/share/doc/player/html of the robot-player-doc package. robot-playervcr(1)
Player May 2009 robot-pmaptest(1)
WWW::RobotRules(3) User Contributed Perl Documentation WWW::RobotRules(3)
NAME
WWW::RobotRules - database of robots.txt-derived permissions
SYNOPSIS
use WWW::RobotRules;
my $rules = WWW::RobotRules->new('MOMspider/1.0');

use LWP::Simple qw(get);

{
  my $url = "http://some.place/robots.txt";
  my $robots_txt = get $url;
  $rules->parse($url, $robots_txt) if defined $robots_txt;
}

{
  my $url = "http://some.other.place/robots.txt";
  my $robots_txt = get $url;
  $rules->parse($url, $robots_txt) if defined $robots_txt;
}

# Now we can check if a URL is valid for those servers
# whose "robots.txt" files we've gotten and parsed:
if ($rules->allowed($url)) {
  $c = get $url;
  ...
}
DESCRIPTION
This module parses /robots.txt files as specified in "A Standard for Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>.
Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited.
The same WWW::RobotRules object can be used for one or more parsed /robots.txt files on any number of hosts.
The following methods are provided:
$rules = WWW::RobotRules->new($robot_name)
This is the constructor for WWW::RobotRules objects. The first argument given to new() is the name of the robot.
$rules->parse($robot_txt_url, $content, $fresh_until)
The parse() method takes as arguments the URL that was used to retrieve the /robots.txt file, and the contents of the file.
$rules->allowed($uri)
Returns TRUE if this robot is allowed to retrieve this URL.
$rules->agent([$name])
Get/set the agent name. NOTE: Changing the agent name will clear the robots.txt rules and expire times out of the cache.
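Taken together, a minimal sketch of these methods (the robot name, URLs, and rules text below are illustrative, not part of the module):

use WWW::RobotRules;

my $rules = WWW::RobotRules->new('MyBot/1.0');   # illustrative robot name

# Rules we might have fetched from the site ourselves.
my $robots_txt = "User-agent: *\nDisallow: /private/\n";
$rules->parse('http://example.com/robots.txt', $robots_txt);

print "allowed\n" if     $rules->allowed('http://example.com/index.html');
print "blocked\n" unless $rules->allowed('http://example.com/private/x');

# Switching agents clears the cached rules and expire times.
$rules->agent('OtherBot/1.0');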
ROBOTS.TXT
The format and semantics of the "/robots.txt" file are as follows (this is an edited abstract of
<http://www.robotstxt.org/wc/norobots.html>):
The file consists of one or more records separated by one or more blank lines. Each record contains lines of the form
<field-name>: <value>
The field name is case insensitive. Text after the '#' character on a line is ignored during parsing. This is used for comments. The
following <field-names> can be used:
User-Agent
The value of this field is the name of the robot the record is describing access policy for. If more than one User-Agent field is
present the record describes an identical access policy for more than one robot. At least one field needs to be present per record. If
the value is '*', the record describes the default access policy for any robot that has not matched any of the other records.
The User-Agent fields must occur before the Disallow fields. If a record contains a User-Agent field after a Disallow field, that
constitutes a malformed record. This parser will assume that a blank line should have been placed before that User-Agent field, and
will break the record into two. All the fields before the User-Agent field will constitute a record, and the User-Agent field will be
the first field in a new record.
Disallow
The value of this field specifies a partial URL that is not to be visited. This can be a full path, or a partial path; any URL that
starts with this value will not be retrieved.
Unrecognized records are ignored.
ROBOTS.TXT EXAMPLES
The following example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/" or "/tmp/":
User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
Disallow: /tmp/ # these will soon disappear
This example "/robots.txt" file specifies that no robots should visit any URL starting with "/cyberworld/map/", except the robot called
"cybermapper":
User-agent: *
Disallow: /cyberworld/map/ # This is an infinite virtual URL space
# Cybermapper knows where to go.
User-agent: cybermapper
Disallow:
This example indicates that no robots should visit this site further:
# go away
User-agent: *
Disallow: /
This is an example of a malformed robots.txt file.
# robots.txt for ancientcastle.example.com
# I've locked myself away.
User-agent: *
Disallow: /
# The castle is your home now, so you can go anywhere you like.
User-agent: Belle
Disallow: /west-wing/ # except the west wing!
# It's good to be the Prince...
User-agent: Beast
Disallow:
This file is missing the required blank lines between records. However, the intention is clear.
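As a minimal sketch (the hostname comes from the comment in the file; the check URLs are illustrative), the record splitting described above recovers that intent:

use WWW::RobotRules;

# The malformed file from above, with no blank lines between records.
my $malformed = join "\n",
    '# robots.txt for ancientcastle.example.com',
    'User-agent: *',
    'Disallow: /',
    'User-agent: Belle',
    'Disallow: /west-wing/ # except the west wing!',
    'User-agent: Beast',
    'Disallow:', '';

my $rules = WWW::RobotRules->new('Belle');
$rules->parse('http://ancientcastle.example.com/robots.txt', $malformed);

# Belle's record is split off from the one before it, so Belle may
# go anywhere except the west wing: the first check prints "allowed",
# the second "blocked".
print $rules->allowed('http://ancientcastle.example.com/library/')
    ? "library: allowed\n"   : "library: blocked\n";
print $rules->allowed('http://ancientcastle.example.com/west-wing/door')
    ? "west wing: allowed\n" : "west wing: blocked\n";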
SEE ALSO
LWP::RobotUA, WWW::RobotRules::AnyDBM_File
perl v5.12.1 2009-10-03 WWW::RobotRules(3)