Wed May 10 21:59:57 2023 UTC
py-robot-detection: add version 0.4

robot_detection is a Python module to detect whether a given HTTP User Agent
belongs to a web crawler. It uses the list of registered robots from the
Robots Database at http://www.robotstxt.org.


(markd)
diff -r1.1709 -r1.1710 pkgsrc/www/Makefile
diff -r0 -r1.1 pkgsrc/www/py-robot-detection/DESCR
diff -r0 -r1.1 pkgsrc/www/py-robot-detection/Makefile
diff -r0 -r1.1 pkgsrc/www/py-robot-detection/PLIST
diff -r0 -r1.1 pkgsrc/www/py-robot-detection/distinfo

cvs diff -r1.1709 -r1.1710 pkgsrc/www/Makefile

--- pkgsrc/www/Makefile 2023/05/05 11:01:09 1.1709
+++ pkgsrc/www/Makefile 2023/05/10 21:59:57 1.1710
@@ -1,14 +1,14 @@
-# $NetBSD: Makefile,v 1.1709 2023/05/05 11:01:09 adam Exp $
+# $NetBSD: Makefile,v 1.1710 2023/05/10 21:59:57 markd Exp $
 #
 
 COMMENT=	Packages related to the World Wide Web
 
 SUBDIR+=	R-RCurl
 SUBDIR+=	R-bslib
 SUBDIR+=	R-curl
 SUBDIR+=	R-diffviewer
 SUBDIR+=	R-downlit
 SUBDIR+=	R-gargle
 SUBDIR+=	R-gh
 SUBDIR+=	R-googledrive
 SUBDIR+=	R-htmlwidgets
@@ -836,26 +836,27 @@ SUBDIR+=	py-protego
 SUBDIR+=	py-publicsuffix2
 SUBDIR+=	py-purl
 SUBDIR+=	py-pylint-django
 SUBDIR+=	py-pystache
 SUBDIR+=	py-python-mimeparse
 SUBDIR+=	py-python-multipart
 SUBDIR+=	py-python3-digest
 SUBDIR+=	py-raven
 SUBDIR+=	py-recaptcha
 SUBDIR+=	py-requests-wsgi-adapter
 SUBDIR+=	py-respx
 SUBDIR+=	py-rfc3986
 SUBDIR+=	py-robobrowser
+SUBDIR+=	py-robot-detection
 SUBDIR+=	py-rss2gen
 SUBDIR+=	py-sanic
 SUBDIR+=	py-sanic-routing
 SUBDIR+=	py-scgi
 SUBDIR+=	py-scrapy
 SUBDIR+=	py-selenium
 SUBDIR+=	py-sigal
 SUBDIR+=	py-simpletal
 SUBDIR+=	py-soupsieve
 SUBDIR+=	py-sparqlwrapper
 SUBDIR+=	py-swiftclient
 SUBDIR+=	py-swish-e
 SUBDIR+=	py-telepath

File Added: pkgsrc/www/py-robot-detection/DESCR
robot_detection is a Python module to detect whether a given HTTP User Agent
belongs to a web crawler. It uses the list of registered robots from the
Robots Database at http://www.robotstxt.org.

File Added: pkgsrc/www/py-robot-detection/Makefile
# $NetBSD: Makefile,v 1.1 2023/05/10 21:59:57 markd Exp $

DISTNAME=	robot-detection-0.4
PKGNAME=	${PYPKGPREFIX}-${DISTNAME}
CATEGORIES=	www python
MASTER_SITES=	${MASTER_SITE_PYPI:=r/robot-detection/}

MAINTAINER=	pkgsrc-users@NetBSD.org
HOMEPAGE=	https://github.com/rory/robot-detection/
COMMENT=	Detect web crawlers using HTTP User Agent
LICENSE=	gnu-gpl-v3

DEPENDS+=	${PYPKGPREFIX}-six-[0-9]*:../../lang/py-six

USE_LANGUAGES=	# none

PYTHON_VERSIONS_INCOMPATIBLE=	27

.include "../../lang/python/egg.mk"
.include "../../mk/bsd.pkg.mk"

File Added: pkgsrc/www/py-robot-detection/PLIST
@comment $NetBSD: PLIST,v 1.1 2023/05/10 21:59:57 markd Exp $
${PYSITELIB}/${EGG_INFODIR}/PKG-INFO
${PYSITELIB}/${EGG_INFODIR}/SOURCES.txt
${PYSITELIB}/${EGG_INFODIR}/dependency_links.txt
${PYSITELIB}/${EGG_INFODIR}/requires.txt
${PYSITELIB}/${EGG_INFODIR}/top_level.txt
${PYSITELIB}/robot_detection.py
${PYSITELIB}/robot_detection.pyc
${PYSITELIB}/robot_detection.pyo

File Added: pkgsrc/www/py-robot-detection/distinfo
$NetBSD: distinfo,v 1.1 2023/05/10 21:59:57 markd Exp $

BLAKE2s (robot-detection-0.4.tar.gz) = 3426065d5e685b356e53f0134db577b1a983541706ee89e68abdc3bee3ed0c3b
SHA512 (robot-detection-0.4.tar.gz) = c6b9979143a4dbe68a55c2ef47d3e029c66ec758d4ba17150199ae932201dda5b8e3c31aba092f8a7889fec38b3f647f161db813f6fca12b6c7404deb3726c08
Size (robot-detection-0.4.tar.gz) = 6387 bytes
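
For reference, the DESCR above describes robot_detection as classifying an
HTTP User Agent string against the robotstxt.org Robots Database.  A minimal
usage sketch, assuming the is_robot() entry point documented upstream at
https://github.com/rory/robot-detection/ (the sample User-Agent string is
illustrative only):

    # Sketch: assumes robot_detection exposes is_robot() as upstream documents.
    import robot_detection

    ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
    if robot_detection.is_robot(ua):
        # User Agent matched an entry in the registered robots list
        print("crawler")
    else:
        print("regular browser")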