robotstxt: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker

Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, scrapers, crawlers, ...) are allowed to access specific resources on a domain.
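A minimal sketch of the workflow described above, assuming the package's `robotstxt()`, `paths_allowed()`, and `get_robotstxt()` functions behave as in the documented API (the domain and paths below are illustrative only):

```r
library(robotstxt)

# Download and parse a domain's robots.txt file into an object
rtxt <- robotstxt(domain = "wikipedia.org")

# Ask whether the default bot ("*") may access specific paths;
# returns a logical vector, one element per path
rtxt$check(paths = c("/api/", "/images/"), bot = "*")

# Convenience function: fetch, parse, and check in one call
paths_allowed(paths = "/api/", domain = "wikipedia.org", bot = "*")

# The raw robots.txt text can also be retrieved directly
get_robotstxt(domain = "wikipedia.org")
```

Note that these calls perform an HTTP request (via the imported 'httr' package), so they require network access and an existing robots.txt file on the queried domain.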

Version: 0.3.2
Depends: R (≥ 3.0.0)
Imports: stringr (≥ 1.0.0), httr (≥ 1.0.0)
Suggests: knitr, rmarkdown, dplyr, testthat
Published: 2016-04-28
Author: Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb]
Maintainer: Peter Meissner <retep.meissner at>
License: MIT + file LICENSE
NeedsCompilation: no
Materials: README NEWS
CRAN checks: robotstxt results


Reference manual: robotstxt.pdf
Vignettes: using_robotstxt
Package source: robotstxt_0.3.2.tar.gz
Windows binaries: r-devel:, r-release:, r-oldrel:
OS X Mavericks binaries: r-release: robotstxt_0.3.2.tgz, r-oldrel: robotstxt_0.3.2.tgz
Old sources: robotstxt archive

