Lifecycle of a Spark Program

The following steps explain the lifecycle of a Spark application running under the standalone resource manager, and Figure 3.8 shows the scheduling process of a Spark program:

  1. The user submits a Spark application using the spark-submit command.
  2. spark-submit launches the driver program either on the same node (in client mode) or on a worker node in the cluster (in cluster mode), and invokes the main method specified by the user.
  3. The driver program contacts the cluster manager to request resources for launching executor JVMs, based on the configuration parameters supplied.
  4. The cluster manager launches executor JVMs on worker nodes.
  5. The driver process scans through the user application. Based on the RDD actions and transformations in the program, Spark creates an operator ...
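To make step 1 concrete, a submission against a standalone cluster might look like the sketch below. The master URL, class name, and JAR file are hypothetical placeholders, though the flags themselves are standard spark-submit options:

```shell
# A sketch of submitting an application to a standalone Spark cluster.
# spark://master-host:7077, com.example.MyApp, and my-app.jar are
# hypothetical placeholders for an actual deployment.
spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --class com.example.MyApp \
  --executor-memory 2g \
  --total-executor-cores 8 \
  my-app.jar
```

With `--deploy-mode cluster`, the driver itself runs on a worker node (step 2), and the `--executor-memory` and `--total-executor-cores` values shape the resource request the driver makes to the cluster manager (step 3).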
