Package libwww-robotrules-perl in squeeze-backports (6.01-1~bpo60+1)

database of robots.txt-derived permissions

WWW::RobotRules parses /robots.txt files as specified in "A Standard for Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site. The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed /robots.txt files on any number of hosts.
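
A minimal sketch of typical use, based on the module's documented interface: fetch a site's /robots.txt, parse it into a WWW::RobotRules object, then ask whether a URL may be visited. The robot name "MyBot/1.0" and the example.com URLs are placeholders, not part of the package.

    use strict;
    use warnings;
    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # The robot name is matched against User-agent lines in robots.txt.
    my $rules = WWW::RobotRules->new('MyBot/1.0');

    # Fetch and parse the robots.txt of a host (placeholder host).
    my $robots_url = 'http://example.com/robots.txt';
    my $robots_txt = get($robots_url);
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    # Check whether this robot is allowed to fetch a given URL
    # on any of the hosts whose robots.txt files have been parsed.
    my $url = 'http://example.com/some/page.html';
    if ($rules->allowed($url)) {
        my $content = get($url);
        # ... process the page ...
    }

The same $rules object can be fed robots.txt files from additional hosts with further parse() calls, and allowed() will consult the rules for whichever host the queried URL belongs to.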


Depends
perl, liburi-perl
Homepage
http://search.cpan.org/dist/WWW-RobotRules/
Maintainer
Debian Perl Group <pkg-perl-maintainers@lists.alioth.debian.org>
Architectures
all