Executing the MapReduce program in a Hadoop cluster
In the previous recipe, we looked at how to write a MapReduce program for a page view counter. In this recipe, we will explore how to execute it in a Hadoop cluster.
Getting ready
To perform this recipe, you should already have a running Hadoop cluster as well as an IDE such as Eclipse.
How to do it
To execute the program, we first need to package it as a JAR file. JAR stands for Java ARchive, a file format that bundles compiled class files. To create a JAR file in Eclipse, we need to perform the following steps:
- Right-click on the project where you've written your MapReduce program. Then, click on Export.
- Select Java | JAR file and click on the Next button. Browse to the path where you ...
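Once the JAR has been exported, the job can be launched from the cluster's command line with the `hadoop jar` command. The following is a minimal sketch; the JAR name `PageViewCounter.jar`, the driver class `com.example.PageViewCounter`, and the HDFS paths are illustrative assumptions, not names from the recipe:

```shell
# Copy the web server logs into HDFS (skip if they are already there)
# NOTE: all paths and names below are hypothetical examples
hdfs dfs -mkdir -p /input/weblogs
hdfs dfs -put access.log /input/weblogs/

# Launch the MapReduce job; arguments after the driver class are
# passed to its main() method (here: input and output HDFS paths)
hadoop jar PageViewCounter.jar com.example.PageViewCounter \
    /input/weblogs /output/pageviews

# Inspect the results written by the reducer
hdfs dfs -cat /output/pageviews/part-r-00000
```

Note that the output directory must not already exist in HDFS; Hadoop refuses to overwrite it and fails the job, so delete it first with `hdfs dfs -rm -r` if you rerun the job.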