Chapter 6. Optimizing MapReduce Tasks

Most MapReduce programs are written for data analysis, and they usually take a long time to complete. Many companies are embracing Hadoop for advanced data analytics over large datasets that require completion-time guarantees. Efficiency, and especially the I/O cost of MapReduce, still needs to be addressed for successful implementations.

In this chapter, we will discuss optimization techniques such as compression and Combiners to improve job execution. You will also learn basic guidelines and rules for optimizing your mapper and reducer code, along with techniques for reusing object instances; a short sketch of these ideas follows this paragraph.
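The following is a minimal, illustrative sketch (not the book's own example) of a word-count job that applies these three ideas: the reducer is reused as a Combiner, intermediate map output is compressed, and Writable instances are reused instead of being allocated per record. It assumes the newer org.apache.hadoop.mapreduce API and that the Snappy codec is available on the cluster; the class name OptimizedWordCount and the property names shown are Hadoop 2.x conventions.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.SnappyCodec;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class OptimizedWordCount {

    public static class TokenMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        // Reused Writable instances: avoids creating new objects for every record.
        private final Text word = new Text();
        private static final IntWritable ONE = new IntWritable(1);

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Compress intermediate map output to cut shuffle I/O
        // (assumes the Snappy native libraries are installed).
        conf.setBoolean("mapreduce.map.output.compress", true);
        conf.setClass("mapreduce.map.output.compress.codec",
                SnappyCodec.class, CompressionCodec.class);

        Job job = Job.getInstance(conf, "optimized word count");
        job.setJarByClass(OptimizedWordCount.class);
        job.setMapperClass(TokenMapper.class);
        // The sum operation is associative and commutative, so the reducer
        // can also run as a Combiner on each map node to shrink shuffle data.
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Each of these choices is discussed in detail in the sections that follow.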

The following topics will be covered in this chapter:

  • The benefits of using Combiners ...
