path: root/www/Makefile
author    Frederic Culot <culot@FreeBSD.org>  2011-04-14 13:18:39 +0000
committer Frederic Culot <culot@FreeBSD.org>  2011-04-14 13:18:39 +0000
commit    3395f956808458a4a02bb6e6a72b6663c5c74773 (patch)
tree      20c10dd8a1fb49d9f0f14d8ff68b50a74deab4f5 /www/Makefile
parent    4a17519108015444fc7064f519c2478793e47421 (diff)
download  ports-3395f956808458a4a02bb6e6a72b6663c5c74773.tar.gz
          ports-3395f956808458a4a02bb6e6a72b6663c5c74773.zip
WWW::RobotRules parses /robots.txt files, which are used to forbid conforming
robots from accessing parts of a web site. The parsed files are kept in a
WWW::RobotRules object, and this object provides methods to check if access
to a given URL is prohibited.

WWW: http://search.cpan.org/dist/WWW-RobotRules/

This new port is needed to update www/p5-libwww.
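A minimal usage sketch of the interface described above (the agent name
'MyBot/1.0' and the host 'some.example' are placeholders, and LWP::Simple is
used here only to fetch the robots.txt file):

    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Rules object tied to our robot's User-Agent name (placeholder name)
    my $rules = WWW::RobotRules->new('MyBot/1.0');

    # Fetch and parse the site's robots.txt (placeholder host)
    my $robots_url = 'http://some.example/robots.txt';
    my $robots_txt = get($robots_url);
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    # Ask whether a URL may be visited before actually fetching it
    my $url = 'http://some.example/private/page.html';
    print "fetching $url\n" if $rules->allowed($url);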
Notes:
    svn path=/head/; revision=272722
Diffstat (limited to 'www/Makefile')
-rw-r--r--  www/Makefile | 1 +
1 file changed, 1 insertion(+), 0 deletions(-)
diff --git a/www/Makefile b/www/Makefile
index 364ac3bf1cda..338ba128bfdc 100644
--- a/www/Makefile
+++ b/www/Makefile
@@ -1286,6 +1286,7 @@
SUBDIR += p5-WWW-Pastebin-PastebinCom-Create
SUBDIR += p5-WWW-Plurk
SUBDIR += p5-WWW-Robot
+ SUBDIR += p5-WWW-RobotRules
SUBDIR += p5-WWW-RobotRules-Parser
SUBDIR += p5-WWW-Scraper-ISBN
SUBDIR += p5-WWW-Scraper-ISBN-Amazon_Driver