R interface to Apache Spark, a fast and general engine for big data
processing; see <http://spark.apache.org>. This package supports connecting to
local and remote Apache Spark clusters, provides a 'dplyr'-compatible back-end,
and provides an interface to Spark's built-in machine learning algorithms.
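As a brief illustration of these three capabilities, the sketch below connects to a local Spark instance, runs a dplyr query against a Spark DataFrame, and fits one of Spark's built-in models. It assumes Spark is already installed locally (for example via spark_install()); the mtcars data set and the choice of model are illustrative only.

library(sparklyr)
library(dplyr)

# Connect to a local Spark cluster.
sc <- spark_connect(master = "local")

# Copy an R data frame into Spark and query it with dplyr verbs;
# the query is translated to Spark SQL and executed in the cluster.
mtcars_tbl <- copy_to(sc, mtcars, overwrite = TRUE)

mtcars_tbl %>%
  group_by(cyl) %>%
  summarise(avg_mpg = mean(mpg, na.rm = TRUE))

# Fit a Spark MLlib linear regression through the ml_* interface.
fit <- ml_linear_regression(mtcars_tbl,
                            response = "mpg",
                            features = c("wt", "cyl"))
summary(fit)

spark_disconnect(sc)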
Version: 0.6.4
Depends: R (≥ 3.1.2)
Imports: assertthat, base64enc, broom, config (≥ 0.2), DBI (≥ 0.6-1), dplyr (≥ 0.7.2), dbplyr (≥ 1.1.0), digest, httr (≥ 1.2.1), jsonlite (≥ 1.4), lazyeval (≥ 0.2.0), methods, openssl (≥ 0.8), rappdirs, readr (≥ 1.1.0), rlang (≥ 0.1), rprojroot, rstudioapi, shiny (≥ 1.0.1), withr, xml2
Suggests: ggplot2, janeaustenr, nycflights13, testthat, RCurl
Published: 2017-11-02
Author: Javier Luraschi [aut, cre], Kevin Ushey [aut], JJ Allaire [aut], RStudio [cph], The Apache Software Foundation [aut, cph]
Maintainer: Javier Luraschi <javier at rstudio.com>
BugReports: https://github.com/rstudio/sparklyr/issues
License: Apache License 2.0 | file LICENSE
URL: http://spark.rstudio.com
NeedsCompilation: no
SystemRequirements: Spark: 1.6.x or 2.x
Materials: README NEWS
CRAN checks: sparklyr results