robotstxt: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker

Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
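A minimal usage sketch, assuming the exported functions get_robotstxt(), robotstxt(), and paths_allowed() described in the package documentation; the domain and paths shown are purely illustrative:

    library(robotstxt)

    ## download the raw robots.txt of a domain
    rtxt <- get_robotstxt(domain = "wikipedia.org")

    ## parse it into a robotstxt object and check a single path
    rt <- robotstxt(domain = "wikipedia.org")
    rt$check(paths = "/api/", bot = "*")

    ## or check several paths directly via the convenience function,
    ## which returns one TRUE/FALSE per path
    paths_allowed(
      paths  = c("/api/", "/images/"),
      domain = "wikipedia.org",
      bot    = "*"
    )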

Version: 0.6.2
Depends: R (≥ 3.0.0)
Imports: stringr (≥ 1.0.0), httr (≥ 1.0.0), spiderbar (≥ 0.2.0), future (≥ 1.6.2), future.apply (≥ 1.0.0), magrittr, utils
Suggests: knitr, rmarkdown, dplyr, testthat, covr
Published: 2018-07-18
Author: Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb]
Maintainer: Peter Meissner <retep.meissner at gmail.com>
BugReports: https://github.com/ropensci/robotstxt/issues
License: MIT + file LICENSE
URL: https://github.com/ropensci/robotstxt
NeedsCompilation: no
Materials: README NEWS
In views: WebTechnologies
CRAN checks: robotstxt results

Downloads:

Reference manual: robotstxt.pdf
Vignettes: using_robotstxt
Package source: robotstxt_0.6.2.tar.gz
Windows binaries: r-devel: robotstxt_0.6.2.zip, r-release: robotstxt_0.6.2.zip, r-oldrel: robotstxt_0.6.2.zip
OS X binaries: r-release: robotstxt_0.6.2.tgz, r-oldrel: robotstxt_0.6.2.tgz
Old sources: robotstxt archive

Reverse dependencies:

Reverse suggests: rzeit2, spiderbar

Linking:

Please use the canonical form https://CRAN.R-project.org/package=robotstxt to link to this page.