Name       : perl-WWW-RobotRules
Version    : 6.20.0
Vendor     : (none)
Release    : 9pclos2017
Date       : 2017-07-13 05:08:58
Group      : Development/Perl
Source RPM : perl-WWW-RobotRules-6.20.0-9pclos2017.src.rpm
Size       : 0.03 MB
Packager   : bb2 < bb2>
Summary    : Parse /robots.txt file
Description :
This module parses _/robots.txt_ files as specified in "A Standard for Robot Exclusion" at <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use the _/robots.txt_ file to forbid conforming robots from accessing parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed _/robots.txt_ files on any number of hosts.
The methods provided include new(), parse(), allowed(), and agent().
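Based on the module's documented interface, a minimal usage sketch follows; the robot name and URLs are illustrative, and LWP::Simple is assumed to be available for fetching:

```perl
use strict;
use warnings;
use WWW::RobotRules;
use LWP::Simple qw(get);

# Create a rules object identifying this robot by its User-Agent name
# (robot name is illustrative).
my $rules = WWW::RobotRules->new('MyBot/1.0');

# Fetch and parse a site's /robots.txt (URL is illustrative).
my $url = 'http://example.com/robots.txt';
my $robots_txt = get($url);
$rules->parse($url, $robots_txt) if defined $robots_txt;

# Check whether a page may be fetched before retrieving it.
my $page = 'http://example.com/some/page.html';
if ($rules->allowed($page)) {
    my $content = get($page);
    # ... process $content ...
}
```

The same $rules object can be fed _/robots.txt_ files from additional hosts by calling parse() again with each host's robots.txt URL and content.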
RPM found in directory: /vol/rzm3/linux-pclinuxos/pclinuxos/apt/pclinuxos/64bit/RPMS.x86_64