Running the Model on Hadoop

We used BigML to build the classification tree from historical campaign response data. BigML exports this model as Java code, so we can now run it on Hadoop by embedding it in a MapReduce job.

We will need a MapReduce program of the kind you are already familiar with from the previous chapters of this book. The program is supplied with this book in the directory /hbp/chapt4, in a single file that contains the code for the mapper, the reducer, and the driver. The following classes are included in this file:

public class ResponsePrediction {
    ...
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(ResponsePrediction.class);
        ...
    }
}
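To make the idea concrete, here is a minimal, self-contained sketch of how a BigML-exported tree and the mapper's per-record logic fit together. BigML exports an actionable model as nested if/else blocks in a static method; the field names (recency, numContacts), thresholds, and the CSV layout below are invented for illustration and do not come from the book's actual model. A real mapper would receive each line as a Text value and emit the prediction through the OutputCollector rather than returning it.

```java
// Hypothetical sketch of the pattern: a BigML-style exported decision
// tree plus the per-record scoring a mapper would perform.
public class ResponsePredictionSketch {

    // BigML exports a classification tree as nested if/else blocks
    // similar in shape to this (feature names and splits are made up).
    public static String predict(double recency, double numContacts) {
        if (recency <= 30.0) {
            if (numContacts <= 2.0) {
                return "yes";   // leaf: customer likely to respond
            }
            return "no";        // leaf: contacted too often recently
        }
        return "no";            // leaf: contact too long ago
    }

    // What the map() step does for each input line: parse the record's
    // fields, score it with the exported model, and hand back the class
    // label (a real mapper would emit it as a key-value pair).
    public static String mapRecord(String line) {
        String[] fields = line.split(",");
        double recency = Double.parseDouble(fields[0]);
        double numContacts = Double.parseDouble(fields[1]);
        return predict(recency, numContacts);
    }

    public static void main(String[] args) {
        System.out.println(mapRecord("12,1"));  // recent, lightly contacted
        System.out.println(mapRecord("90,5"));  // stale, heavily contacted
    }
}
```

Because the exported model is just a pure static method, it needs no model server or external state at scoring time, which is what makes it trivial to drop into a mapper and score records in parallel across the cluster.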
