R interface to Apache Spark, a fast and general engine for big data processing; see <http://spark.apache.org>. This package supports connecting to local and remote Apache Spark clusters, provides a 'dplyr'-compatible back-end, and provides an interface to Spark's built-in machine learning algorithms.
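A minimal usage sketch of the workflow the description outlines: connect to a cluster, push data to Spark, query it with dplyr verbs, and fit one of the built-in machine learning models. This assumes a local Spark installation (for example one set up with spark_install()); the data set and model choice here are illustrative only.

    library(sparklyr)
    library(dplyr)

    # Connect to a local Spark cluster
    sc <- spark_connect(master = "local")

    # Copy an R data frame into Spark and query it with dplyr verbs;
    # the computation runs in Spark until collect() brings results back to R
    mtcars_tbl <- copy_to(sc, mtcars, overwrite = TRUE)
    mtcars_tbl %>%
      group_by(cyl) %>%
      summarise(avg_mpg = mean(mpg)) %>%
      collect()

    # Fit one of Spark's built-in machine learning models on the remote table
    fit <- ml_linear_regression(mtcars_tbl,
                                response = "mpg",
                                features = c("wt", "cyl"))

    spark_disconnect(sc)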
Version: 0.5.1
Depends: R (≥ 3.1.2)
Imports: methods, lazyeval (≥ 0.2.0), dplyr (≥ 0.5.0), DBI (≥ 0.4.1), readr (≥ 0.2.0), digest, config, rappdirs, assertthat, rprojroot, withr, httr, jsonlite, base64enc
Suggests: testthat, RCurl, janeaustenr
Published: 2016-12-19
Author: Javier Luraschi [aut, cre], Kevin Ushey [aut], JJ Allaire [aut], RStudio [cph], The Apache Software Foundation [aut, cph]
Maintainer: Javier Luraschi <javier at rstudio.com>
BugReports: https://github.com/rstudio/sparklyr/issues
License: Apache License 2.0 | file LICENSE
URL: http://spark.rstudio.com
NeedsCompilation: no
Materials: README NEWS
CRAN checks: sparklyr results