robotstxt: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker

Provides a 'Robotstxt' ('R6') class and accompanying methods to parse and check 'robots.txt' files. Parsed data fields are exposed as data frames and vectors. Permissions can be checked by supplying character vectors of paths and, optionally, a bot name.
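As a sketch of the usage the description implies (the exact API of version 0.1.2 may differ; the class generator name `robotstxt`, the `check()` method, the `permissions` field, and the example domain are assumptions for illustration):

```r
# install.packages("robotstxt")  # install from CRAN first
library(robotstxt)

# Create an object for a domain; the robots.txt file is fetched
# (via httr) and parsed into data frames and vectors
# (generator/field names assumed for illustration)
rtxt <- robotstxt$new(domain = "wikipedia.org")

# Inspect a parsed data field, e.g. the permissions data frame
rtxt$permissions

# Check whether paths may be crawled, optionally for a specific bot
rtxt$check(paths = c("/", "/images/"), bot = "*")
```

The check accepts a character vector of paths, so several paths can be tested for one bot in a single call.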

Version: 0.1.2
Depends: R (≥ 3.0.0)
Imports: R6 (≥ 2.1.1), stringr (≥ 1.0.0), httr (≥ 1.0.0)
Suggests: knitr, rmarkdown, dplyr, testthat
Published: 2016-02-08
Author: Peter Meissner [aut, cre], Oliver Keys [ctb]
Maintainer: Peter Meissner <retep.meissner at>
License: MIT + file LICENSE
NeedsCompilation: no
Materials: README
CRAN checks: robotstxt results


Reference manual: robotstxt.pdf
Vignettes: using_robotstxt
Package source: robotstxt_0.1.2.tar.gz
Windows binaries: r-devel:, r-release:, r-oldrel:
OS X Snow Leopard binaries: r-release: robotstxt_0.1.2.tgz, r-oldrel: not available
OS X Mavericks binaries: r-release: robotstxt_0.1.2.tgz