Managing MapReduce jobs

The Hadoop Big Data platform accepts jobs submitted by clients. In a multiuser environment, multiple jobs can be submitted and run simultaneously. Managing Hadoop jobs includes checking job status, changing the priority of jobs, killing a running job, and so on. In this recipe, we will outline the steps for these job management tasks.

Getting ready

We assume that our Hadoop cluster has been configured properly and all the Hadoop daemons are running without any issues. We also assume that a regular user can submit Hadoop jobs to the cluster.

Log in to the master node from the cluster administrator machine with the following command:

ssh hduser@master

How to do it...

Perform the following steps to check the status of ...
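As a sketch of the tasks this recipe covers, the classic `hadoop job` subcommands below handle listing, status checks, priority changes, and killing jobs. These assume an MRv1-style cluster; on YARN/MRv2 clusters, substitute `mapred job` for `hadoop job`. The job ID shown is a placeholder; use an ID reported by the list command on your cluster.

```shell
# List all currently running jobs and note the job ID of interest
hadoop job -list

# Placeholder job ID for the commands below; replace with a real ID
JOB_ID="job_201302152353_0001"

# Check the detailed status of a specific job (progress, counters, tracking URL)
hadoop job -status "$JOB_ID"

# Change the priority of a queued or running job; valid values are
# VERY_HIGH, HIGH, NORMAL, LOW, and VERY_LOW
hadoop job -set-priority "$JOB_ID" HIGH

# Kill a running job
hadoop job -kill "$JOB_ID"
```

Note that `-set-priority` only affects scheduling of tasks that have not yet been launched, and killing a job discards all of its in-progress work.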
