In this section, we will configure our Spark cluster so that we can deploy and execute our Spark application.
Spark essentially enables the distributed execution of a given piece of code. Although we will discuss the Spark architecture in detail in the next chapter, let's briefly cover the major components that need to be configured when setting up a Spark cluster.
The following are the high-level components involved in setting up the Spark cluster:
SparkContext: The entry point to a Spark application. It connects to the cluster manager and requests the resources needed to execute jobs in distributed mode.