R interface to Apache Spark, a fast and general engine for big data processing. This package supports connecting to local and remote Apache Spark clusters, provides a 'dplyr'-compatible back-end, and provides an interface to Spark's built-in machine learning algorithms.

Documentation

Manual: sparklyr.pdf
Vignette: None available.

Maintainer: Javier Luraschi <javier at rstudio.com>

Author(s): Javier Luraschi, Kevin Ushey, JJ Allaire, RStudio, The Apache Software Foundation

Install the package and any missing dependencies by running this line in your R console:

install.packages("sparklyr")
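Once installed, a minimal session might look like the sketch below: connect to a local Spark cluster, copy a data frame into Spark, and query it with dplyr verbs. This assumes a local Spark installation is available (sparklyr provides `spark_install()` to set one up).

```r
library(sparklyr)
library(dplyr)

# Connect to a local Spark instance
sc <- spark_connect(master = "local")

# Copy the built-in iris data frame into Spark as a table
iris_tbl <- copy_to(sc, iris, "iris")

# Use dplyr verbs; the query is translated to Spark SQL and run in Spark
iris_tbl %>%
  group_by(Species) %>%
  summarise(count = n())

# Close the connection when done
spark_disconnect(sc)
```

The dplyr back-end translates these verbs to Spark SQL, so the data stays in Spark and only the result is collected back into R.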

Depends: R (>= 3.1.2)
Imports: methods, lazyeval (>= 0.2.0), dplyr (>= 0.7.0), dbplyr (>= 1.0.0), DBI (>= 0.6), readr (>= 0.2.0), digest, config, rappdirs, assertthat, rprojroot, withr, httr, jsonlite, base64enc, rlang (>= 0.1), rstudioapi, shiny (>= 1.0.1)
Suggests: testthat, RCurl, janeaustenr
Reverse imports: rsparkling, spark.sas7bdat, sparkwarc
Reverse suggests: replyr

Package: sparklyr
URL: http://spark.rstudio.com
Version: 0.5.6
Published: 2017-06-10
License: Apache License 2.0 | file LICENSE
BugReports: https://github.com/rstudio/sparklyr/issues
NeedsCompilation: no
CRAN checks: sparklyr check results
Package source: sparklyr_0.5.6.tar.gz