Spark standalone mode uses its own built-in scheduler and does not depend on any external cluster manager such as YARN or Mesos. To install Spark in standalone mode, you copy the Spark binary installation package onto every machine in the cluster.
In standalone mode, the client interacts with the cluster either through spark-submit or the Spark shell. In either case, the Driver communicates with the Spark Master node to request worker nodes on which executors can be launched for the application.
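To make this interaction concrete, here is a minimal Scala sketch of a driver application that connects to a standalone Master. The Master host name and application name are placeholder assumptions (7077 is the default standalone Master port); in practice you would pass the same Master URL to spark-submit or the Spark shell via the --master option.

```scala
import org.apache.spark.sql.SparkSession

object StandaloneExample {
  def main(args: Array[String]): Unit = {
    // Connect the driver to the standalone Master.
    // "master-host" is a placeholder for your actual Master node address;
    // 7077 is the default port of the standalone Master.
    val spark = SparkSession.builder()
      .appName("standalone-example")          // hypothetical application name
      .master("spark://master-host:7077")     // standalone Master URL
      .getOrCreate()

    // The Master assigns worker nodes, and executors are launched on them
    // to run the tasks of this simple job.
    val count = spark.sparkContext.parallelize(1 to 1000).count()
    println(s"Count: $count")

    spark.stop()
  }
}
```

When the same application is packaged as a JAR, an equivalent submission would be spark-submit --master spark://master-host:7077 --class StandaloneExample app.jar, again with the Master URL being a placeholder for your cluster.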
The following shows the standalone deployment of Spark using a Master node and worker nodes:
Let's now ...