Spark uses log4j for its own logging. All operations that happen on the backend are logged to the Spark shell console (whose appender is already configured to write to the underlying storage). Spark provides a log4j template as a properties file, and we can extend and modify that file for our own logging in Spark. Move to the SPARK_HOME/conf directory and you should see the log4j.properties.template file; it serves as a starting point for our own logging configuration.
Now, let's create our own custom logging configuration to use while running a Spark job. When you are done editing, rename the file to log4j.properties and place it in the same directory (that is, in the project tree). A sample snapshot of the file can be seen as follows:
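As a sketch of what such a file might contain, the following minimal log4j.properties is based on the defaults in Spark's shipped template; the exact logger names and levels are illustrative, and you can adjust them to suit your job:

```properties
# Set the root logger level and attach the console appender
log4j.rootCategory=INFO, console

# Console appender writing to stderr with a timestamped pattern
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Quiet down noisy third-party loggers (levels here are a matter of taste)
log4j.logger.org.spark_project.jetty=WARN
log4j.logger.org.apache.spark.repl.Main=WARN
```

Lowering the rootCategory level to DEBUG, or raising individual loggers to ERROR, is the usual way to tune how much output reaches the console.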