R interface to Apache Spark, a fast and general engine for big data
processing; see <http://spark.apache.org>. This package supports connecting to
local and remote Apache Spark clusters, provides a 'dplyr'-compatible back-end,
and exposes an interface to Spark's built-in machine learning algorithms.
Version: 1.0.1
Depends: R (≥ 3.2)
Imports: assertthat, base64enc, config (≥ 0.2), DBI (≥ 0.6-1), dplyr (≥ 0.7.2), dbplyr (≥ 1.1.0), digest, forge, generics, httr (≥ 1.2.1), jsonlite (≥ 1.4), methods, openssl (≥ 0.8), purrr, r2d3, rappdirs, rlang (≥ 0.1.4), rprojroot, rstudioapi (≥ 0.6), tibble, tidyr, withr, xml2, ellipsis (≥ 0.1.0)
Suggests: broom, ggplot2, glmnet, janeaustenr, Lahman, mlbench, nnet, nycflights13, R6, RCurl, reshape2, shiny (≥ 1.0.1), testthat
Published: 2019-05-17
Author: Javier Luraschi [aut, cre], Kevin Kuo [aut], Kevin Ushey [aut], JJ Allaire [aut], Samuel Macedo [ctb], RStudio [cph], The Apache Software Foundation [aut, cph]
Maintainer: Javier Luraschi <javier at rstudio.com>
BugReports: https://github.com/rstudio/sparklyr/issues
License: Apache License 2.0 | file LICENSE
URL: http://spark.rstudio.com
NeedsCompilation: no
SystemRequirements: Spark: 1.6.x or 2.x
Materials: README NEWS
In views: ModelDeployment
CRAN checks: sparklyr results
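
The description above mentions connecting to Spark clusters, the 'dplyr'-compatible back-end, and the interface to Spark's built-in machine learning algorithms. The following is a minimal usage sketch of that workflow, assuming Spark is installed locally (for example via spark_install()); the data set and model choice are illustrative, not part of the package documentation.

    library(sparklyr)
    library(dplyr)

    # Connect to a local Spark cluster (spark_install() can download Spark first)
    sc <- spark_connect(master = "local")

    # Copy an R data frame into Spark; dplyr verbs on the resulting table
    # are translated to Spark SQL by the dplyr/dbplyr back-end
    mtcars_tbl <- copy_to(sc, mtcars, overwrite = TRUE)
    mtcars_tbl %>%
      group_by(cyl) %>%
      summarise(avg_mpg = mean(mpg, na.rm = TRUE))

    # Fit one of Spark's built-in machine learning models (Spark MLlib)
    fit <- ml_linear_regression(mtcars_tbl, mpg ~ wt + cyl)
    summary(fit)

    # Close the connection when done
    spark_disconnect(sc)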