robotstxt: A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker

Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
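As a minimal sketch of that permission check, the snippet below builds a robotstxt object from an in-memory 'robots.txt' and queries it with the object's check() method; the rules and paths shown are illustrative, not taken from any real site, and no download is required:

```r
library(robotstxt)

# Construct a robotstxt object from raw robots.txt text instead of a domain
rt <- robotstxt(text = "User-agent: *\nDisallow: /private/\n")

# check() returns one logical per path: TRUE means the bot may access it
rt$check(paths = c("/index.html", "/private/data.csv"), bot = "*")
```

For live sites, the package's paths_allowed() function performs the same check after downloading the domain's 'robots.txt', e.g. paths_allowed(paths = "/api/", domain = "example.com").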

Version: 0.7.13
Depends: R (≥ 3.0.0)
Imports: stringr (≥ 1.0.0), httr (≥ 1.0.0), spiderbar (≥ 0.2.0), future (≥ 1.6.2), future.apply (≥ 1.0.0), magrittr, utils
Suggests: knitr, rmarkdown, dplyr, testthat, covr
Published: 2020-09-03
DOI: 10.32614/CRAN.package.robotstxt
Author: Peter Meissner [aut, cre], Kun Ren [aut, cph] (author and copyright holder of list_merge.R), Oliver Keys [ctb] (original release code review), Rich Fitz John [ctb] (original release code review)
Maintainer: Peter Meissner <retep.meissner at>
License: MIT + file LICENSE
NeedsCompilation: no
Materials: README NEWS
In views: WebTechnologies
CRAN checks: robotstxt results


Reference manual: robotstxt.pdf
Vignettes: using_robotstxt


Package source: robotstxt_0.7.13.tar.gz
Windows binaries: r-devel:, r-release:, r-oldrel:
macOS binaries: r-release (arm64): robotstxt_0.7.13.tgz, r-oldrel (arm64): robotstxt_0.7.13.tgz, r-release (x86_64): robotstxt_0.7.13.tgz, r-oldrel (x86_64): robotstxt_0.7.13.tgz
Old sources: robotstxt archive

Reverse dependencies:

Reverse imports: polite, ralger
Reverse suggests: newsanchor, spiderbar, vosonSML, webchem


Please use the canonical form to link to this page.