Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, scrapers, ...) are allowed to access specific resources on a domain.

Documentation

Manual: robotstxt.pdf
Vignette: using_robotstxt

Maintainer: Peter Meissner <retep.meissner at gmail.com>

Author(s): Peter Meissner, Oliver Keys, Rich Fitz John

Install the package and any missing dependencies by running this line in your R console:

install.packages("robotstxt")
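Once installed, typical usage is to fetch a domain's robots.txt and check path permissions. The sketch below assumes the `robotstxt()` constructor and the `paths_allowed()` helper described in the package manual; it requires network access, and the exact argument names may differ between versions:

```r
library(robotstxt)

# fetch and parse a domain's robots.txt (requires network access)
rt <- robotstxt(domain = "wikipedia.org")

# check whether specific paths may be accessed by the default bot ("*")
rt$check(paths = c("/", "/images/"))

# convenience wrapper that fetches and checks in one call
paths_allowed(paths = "/", domain = "wikipedia.org", bot = "*")
```

See the vignette (using_robotstxt) for a fuller walk-through.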

Depends: R (>= 3.0.0)
Imports: stringr (>= 1.0.0), httr (>= 1.0.0)
Suggests: knitr, rmarkdown, dplyr, testthat

Package: robotstxt
Version: 0.3.2
Published: 2016-12-05
License: MIT + file LICENSE
URL: https://github.com/ropenscilabs/robotstxt
BugReports: https://github.com/ropenscilabs/robotstxt/issues
NeedsCompilation: no
CRAN checks: robotstxt check results
Package source: robotstxt_0.3.2.tar.gz