Once the configuration is all set, we import a Cassandra table into Spark and register it as a local table. This allows any kind of query to be run against it. Refer to the Spark R API docs for further information: https://spark.apache.org/docs/latest/api/R/index.html.
The commands are as follows:
# For the all-in-one Docker image, this is already installed:
# install.packages(c("sparklyr", "dplyr"), repos = "http://cran.us.r-project.org")
library(sparklyr)

config <- spark_config()
config$sparklyr.defaultPackages = "com.datastax.spark:spark-cassandra-connector_2.11:2.3.0"
config$spark.driver.host = '127.0.0.1'
config$spark.cassandra.connection.host = '127.0.0.1'
config$spark.cassandra.auth.username = 'cassandra'
config$spark.cassandra.auth.password ...
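With the configuration in place, connecting, registering the table, and querying it looks roughly like the following. This is a minimal sketch: the keyspace (test_keyspace), table (emp), and column (salary) names are placeholders for illustration and should be replaced with your own schema; the data source string org.apache.spark.sql.cassandra is the one exposed by the DataStax connector loaded above.

library(dplyr)

# Connect to a local Spark instance using the configuration built above
sc <- spark_connect(master = "local", config = config)

# Import the Cassandra table and register it locally as "emp"
# (keyspace/table names here are placeholders -- adjust to your schema)
emp_tbl <- spark_read_source(
  sc,
  name    = "emp",
  source  = "org.apache.spark.sql.cassandra",
  options = list(keyspace = "test_keyspace", table = "emp"),
  memory  = TRUE
)

# Any kind of query can now be run against the registered table,
# either through dplyr verbs ...
emp_tbl %>%
  filter(salary > 50000) %>%
  count() %>%
  collect()

# ... or through plain SQL, since sparklyr connections implement DBI
DBI::dbGetQuery(sc, "SELECT * FROM emp LIMIT 10")

spark_disconnect(sc)

Setting memory = TRUE caches the imported data in Spark's memory, so repeated queries do not re-read from Cassandra; for very large tables you may prefer memory = FALSE and let the connector push filters down instead.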