Apache Spark 2.x for Java Developers by Sumit Kumar, Sourav Gulati

Spark job configuration and submission

When a Spark job is launched, it creates a SparkConf object and passes it to the constructor of SparkContext. The SparkConf object holds a near-exhaustive list of customizable parameters that can be used to tune a Spark job to the available cluster resources. Because the SparkConf object becomes immutable once it is passed to the SparkContext constructor, it is important not only to identify, but also to set, all the required SparkConf parameters before creating the SparkContext object.
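A minimal sketch in Java of this set-then-construct pattern (the class name, application name, and chosen property values are illustrative; the code assumes spark-core 2.x on the classpath):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ConfExample {
    public static void main(String[] args) {
        // All tuning parameters must be set before the context is created;
        // once passed to JavaSparkContext, the SparkConf is effectively frozen.
        SparkConf conf = new SparkConf()
                .setAppName("conf-example")         // illustrative app name
                .setMaster("local[*]")              // run locally on all cores
                .set("spark.executor.memory", "2g")
                .set("spark.serializer",
                     "org.apache.spark.serializer.KryoSerializer");

        JavaSparkContext sc = new JavaSparkContext(conf);

        // Calling conf.set(...) here would have no effect on the running context.
        sc.close();
    }
}
```

Note that the setters return the SparkConf itself, so the configuration can be built fluently in a single expression before the context is constructed.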

There are different ways in which a Spark job can be configured.

Spark's conf directory provides the default configurations used to execute a Spark job. The SPARK_CONF_DIR environment variable can be used to override the default location of the conf directory, ...
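As a sketch of how these mechanisms combine (the directory path, main class, and jar name below are placeholders): defaults come from spark-defaults.conf in the conf directory, flags passed to spark-submit override those defaults, and values set programmatically on SparkConf override both.

```shell
# Point Spark at a custom configuration directory (assumed path).
export SPARK_CONF_DIR=/opt/spark/custom-conf

# $SPARK_CONF_DIR/spark-defaults.conf supplies cluster-wide defaults, e.g.:
#   spark.executor.memory  1g
#   spark.serializer       org.apache.spark.serializer.KryoSerializer

# --conf flags override spark-defaults.conf for this submission only;
# SparkConf.set(...) inside the application overrides both.
spark-submit \
  --class com.example.ConfExample \
  --master yarn \
  --conf spark.executor.memory=2g \
  app.jar
```

This layering lets administrators ship sensible cluster defaults while individual jobs override only the parameters they need.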
