---
- branch: MAIN
  date: Sun Jul 10 12:47:38 UTC 2011
  files:
  - new: 1.1.1.1
    old: '0'
    path: pkgsrc/www/p5-WWW-RobotRules/Makefile
    pathrev: pkgsrc/www/p5-WWW-RobotRules/Makefile@1.1.1.1
    type: imported
  - new: 1.1.1.1
    old: '0'
    path: pkgsrc/www/p5-WWW-RobotRules/DESCR
    pathrev: pkgsrc/www/p5-WWW-RobotRules/DESCR@1.1.1.1
    type: imported
  - new: 1.1.1.1
    old: '0'
    path: pkgsrc/www/p5-WWW-RobotRules/distinfo
    pathrev: pkgsrc/www/p5-WWW-RobotRules/distinfo@1.1.1.1
    type: imported
  id: 20110710T124738Z.d216174322848f76ce46ab962e6c59f78316dcc6
  log: |
    The Perl 5 module WWW::RobotRules parses /robots.txt files as specified
    in "A Standard for Robot Exclusion", at
    http://www.robotstxt.org/wc/norobots.html
    Webmasters can use the /robots.txt file to forbid conforming robots
    from accessing parts of their web site.

    The parsed files are kept in a WWW::RobotRules object, and this object
    provides methods to check if access to a given URL is prohibited.
    The same WWW::RobotRules object can be used for one or more parsed
    /robots.txt files on any number of hosts.

    Status:

    Vendor Tag:	TNF
    Release Tags:	pkgsrc-base
  module: pkgsrc
  subject: 'CVS commit: pkgsrc/www/p5-WWW-RobotRules'
  unixtime: '1310302058'
  user: spz
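
The log above describes the WWW::RobotRules interface only in prose. As a minimal sketch of how the packaged module is typically used, based on its documented new()/parse()/allowed() methods: new() takes the robot's User-Agent name, parse() records the rules from a fetched /robots.txt, and allowed() checks a URL against them. The robot name and host below are placeholder assumptions, not taken from the commit.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Identify the robot; Disallow rules are matched against this name.
    # 'MyRobot/1.0' is a placeholder User-Agent string.
    my $rules = WWW::RobotRules->new('MyRobot/1.0');

    # Fetch and parse a site's /robots.txt (example.org is a placeholder host).
    my $url       = 'http://example.org/robots.txt';
    my $robots_txt = get($url);
    $rules->parse($url, $robots_txt) if defined $robots_txt;

    # The same object can hold parsed rules for any number of hosts;
    # allowed() returns true if the robot may fetch the given URL.
    if ($rules->allowed('http://example.org/some/page.html')) {
        print "fetch allowed\n";
    }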