Provides functions to download and parse 'robots.txt' files. The package makes it easy to check whether bots (spiders, scrapers, ...) are allowed to access specific resources on a domain.
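A minimal sketch of the described workflow, downloading a domain's robots.txt and checking path permissions. It requires network access, and the exact function and argument names are assumptions based on the package's documented interface, which may differ between versions:

```r
library(robotstxt)

# download and parse the robots.txt file of a domain
# (constructor name and arguments assumed from the package documentation)
rtxt <- robotstxt(domain = "wikipedia.org")

# check whether the given paths may be accessed by a particular bot;
# "*" matches any user agent
rtxt$check(paths = c("/", "/images/"), bot = "*")
```

The result indicates, per path, whether access is permitted for the chosen user agent.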
Version: 0.3.2
Depends: R (≥ 3.0.0)
Imports: stringr (≥ 1.0.0), httr (≥ 1.0.0)
Suggests: knitr, rmarkdown, dplyr, testthat
Published: 2016-04-28
Author: Peter Meissner [aut, cre], Oliver Keys [ctb], Rich Fitz John [ctb]
Maintainer: Peter Meissner <retep.meissner at gmail.com>
BugReports: https://github.com/ropenscilabs/robotstxt/issues
License: MIT + file LICENSE
URL: https://github.com/ropenscilabs/robotstxt
NeedsCompilation: no
Materials: README NEWS
CRAN checks: robotstxt results
Reference manual: robotstxt.pdf
Vignettes: using_robotstxt
Package source: robotstxt_0.3.2.tar.gz
Windows binaries: r-devel: robotstxt_0.3.2.zip, r-release: robotstxt_0.3.2.zip, r-oldrel: robotstxt_0.3.2.zip
OS X Mavericks binaries: r-release: robotstxt_0.3.2.tgz, r-oldrel: robotstxt_0.3.2.tgz
Old sources: robotstxt archive
Please use the canonical form https://CRAN.R-project.org/package=robotstxt to link to this page.