Chapter 8. Running Spark Jobs
In this chapter, we will see how to run Spark jobs from Oozie. Spark has changed the whole Hadoop ecosystem and the Big Data world. It can be used as an ETL tool or a machine learning tool, and it can take over tasks for which we traditionally used Pig, Hive, or Sqoop.
In this chapter, we will:
- Create an Oozie Workflow for Spark actions
From a concepts point of view, we will:
- Understand the concept of Bundles
We will start off with a simple Workflow in which we rewrite in Spark the same Pig logic for finding the maximum rainfall in a given month, and then schedule it using an Oozie Workflow and Coordinators. The idea is to show the beauty of Spark: how seamlessly it replaces tools such as Pig or Hive, and how it has become ...
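To give a feel for what the Workflow definition will look like, here is a hedged sketch of a `workflow.xml` using Oozie's Spark action. The application class, JAR path, property names, and node names are placeholder assumptions for illustration; the element structure follows the `uri:oozie:spark-action:0.1` schema:

```xml
<workflow-app name="spark-rainfall-wf" xmlns="uri:oozie:workflow:0.5">
  <start to="spark-node"/>
  <action name="spark-node">
    <spark xmlns="uri:oozie:spark-action:0.1">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <master>yarn-cluster</master>
      <name>MaxRainfall</name>
      <!-- Hypothetical driver class and JAR location -->
      <class>com.example.MaxRainfall</class>
      <jar>${nameNode}/user/${wf:user()}/apps/spark/lib/max-rainfall.jar</jar>
      <arg>${inputDir}</arg>
      <arg>${outputDir}</arg>
    </spark>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Spark job failed: [${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

The `${jobTracker}`, `${nameNode}`, `${inputDir}`, and `${outputDir}` values would come from a `job.properties` file, and a Coordinator can then trigger this Workflow on a schedule.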
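Before wiring anything into Oozie, it helps to see the core of the job itself. The logic is a per-key maximum: group rainfall readings by month and keep the largest value. As a minimal sketch (the function name, record shape, and sample values here are illustrative, not from the book), the aggregation looks like this in plain Python; in Spark the same shape is a one-liner, for example `rdd.reduceByKey(max)` on an RDD of `(month, rainfall)` pairs in PySpark:

```python
def max_rainfall_per_month(records):
    """records: iterable of (month, rainfall_mm) pairs.

    Returns a dict mapping each month to its maximum reading,
    mirroring what Spark's reduceByKey(max) computes per key.
    """
    maxima = {}
    for month, rainfall in records:
        # Keep the larger of the current reading and the best seen so far
        maxima[month] = max(rainfall, maxima.get(month, float("-inf")))
    return maxima


# Hypothetical sample readings
readings = [("Jan", 10.0), ("Jan", 22.5), ("Feb", 5.0)]
print(max_rainfall_per_month(readings))  # {'Jan': 22.5, 'Feb': 5.0}
```

The point of the chapter is that once this logic lives in a Spark job, Oozie only needs to know how to launch it, not what it computes.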