Name : perl-WWW-RobotRules
Version : 6.20.0
Vendor : Mageia.Org
Release : 10.mga8
Date : 2020-02-12 08:37:25
Group : Development/Perl
Source RPM : perl-WWW-RobotRules-6.20.0-10.mga8.src.rpm
Size : 0.03 MB
Packager : umeabot <umeabot>
Summary : Parse /robots.txt file
Description :
This module parses /robots.txt files as specified in "A Standard for Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed /robots.txt files on any number of hosts.
The following methods are provided (the method list was truncated in this listing; these are the methods documented in the module's own POD):

    $rules = WWW::RobotRules->new($robot_name)
    $rules->parse($robot_url, $content, $fresh_until)
    $rules->allowed($uri)
    $rules->agent([$name])
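A minimal usage sketch of the documented interface follows. It is not part of the package metadata: the robot name "MyBot/1.0" and the example.com URLs are placeholders, and fetching robots.txt here assumes LWP::Simple (shipped in the separate libwww-perl/perl-LWP packaging) is also installed.

    #!/usr/bin/perl
    use strict;
    use warnings;

    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Create a rules object identified by our robot's User-Agent name.
    my $rules = WWW::RobotRules->new('MyBot/1.0');

    # Fetch and parse the robots.txt of each host we intend to crawl.
    for my $robots_url ('http://www.example.com/robots.txt') {
        my $robots_txt = get($robots_url);
        $rules->parse($robots_url, $robots_txt) if defined $robots_txt;
    }

    # Ask the same object whether a specific URL may be fetched.
    my $url = 'http://www.example.com/some/page.html';
    if ($rules->allowed($url)) {
        print "Allowed to fetch $url\n";
    }
    else {
        print "robots.txt forbids fetching $url\n";
    }

Because one WWW::RobotRules object can hold parsed robots.txt files from any number of hosts, allowed() consults the rules recorded for the host of whichever URL is being checked.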
RPM found in directory: /vol/rzm3/linux-mageia/distrib/8/x86_64/media/core/release