How it works...

In this example, we show how to connect to a Spark cluster using the local and remote modes prior to Spark 2.0. First, we create a SparkConf object and configure all the required parameters: the master location, the application name, and the working data directory. Next, we create a SparkContext, passing the SparkConf as an argument, to access the Spark cluster. Alternatively, you can specify the master location by passing a JVM argument (for example, -Dspark.master=local[*]) when starting your client program. Finally, we execute a small sample program to prove that our SparkContext is functioning correctly.
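The steps above can be sketched as follows. This is a minimal, illustrative example for the pre-2.0 SparkConf/SparkContext pattern; the application name, local-directory path, and the small sum-of-integers job are placeholder choices, not specifics from the recipe.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ConnectToSpark {
  def main(args: Array[String]): Unit = {
    // Configure the required parameters: master location, application
    // name, and working data directory (paths here are illustrative).
    val conf = new SparkConf()
      .setMaster("local[*]")                 // or e.g. "spark://host:7077" for remote mode
      .setAppName("ConnectToSpark")
      .set("spark.local.dir", "/tmp/spark")  // working directory for shuffle/spill data

    // Create the SparkContext from the SparkConf to access the cluster.
    val sc = new SparkContext(conf)

    // Small sample job to prove the SparkContext works: sum 1..100.
    val total = sc.parallelize(1 to 100).reduce(_ + _)
    println(s"Sum of 1..100 = $total")       // should print 5050

    sc.stop()
  }
}
```

Note that if the master is supplied as a JVM system property (java -Dspark.master=local[*] ...), the setMaster call can be omitted, since SparkConf reads spark.* system properties by default.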
