8 THE NAIVE BAYES CLASSIFIER
In this chapter, we introduce the naive Bayes classifier, which can be applied to data with categorical predictors. We review the concept of conditional probabilities, then present the complete, or exact, Bayesian classifier. We next see why it is impractical in most cases, and learn how it can be modified into the “naive Bayes” classifier, which is more generally applicable.
Naive Bayes in JMP: Naive Bayes is only available in JMP Pro.
8.1 INTRODUCTION
The naive Bayes method (and, indeed, an entire branch of statistics) is named after the Reverend Thomas Bayes (1702–1761). To understand the naive Bayes classifier, we first look at the complete, or exact, Bayesian classifier. The basic principle is simple.
For each new record to be classified (a code sketch of this procedure follows the list):
- Find all of the training records with the same predictor profile (i.e., records having the same predictor values).
- Determine what classes the records belong to and which class is most prevalent.
- Assign that class to the new record.
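A minimal sketch of this exact-profile lookup is shown below in Python rather than JMP; it assumes the training data sit in a pandas DataFrame of categorical predictors, and the function and column names are hypothetical, not part of the book's own implementation.

```python
from collections import Counter

import pandas as pd


def exact_bayes_classify(train_df, outcome_col, new_record):
    """Classify one new record by majority vote among the training
    records that share its exact predictor profile."""
    predictors = [c for c in train_df.columns if c != outcome_col]

    # Keep only training rows whose predictor values all match the new record.
    mask = pd.Series(True, index=train_df.index)
    for col in predictors:
        mask &= train_df[col] == new_record[col]
    matches = train_df[mask]

    if matches.empty:
        return None  # no training record has this exact predictor profile

    # Assign the most prevalent class among the matching records.
    return Counter(matches[outcome_col]).most_common(1)[0][0]
```

Note that the function returns None when no training record matches the new record's profile; this foreshadows the practical problem with the exact Bayesian classifier that the chapter addresses next.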
Alternatively (or in addition), it may be desirable to adjust the method so that it answers the question: “What is the propensity of belonging to the class of interest?” instead of “Which class is the most probable?” Obtaining class probabilities allows using a sliding threshold to classify a record as belonging to class i, even if i is not the most probable class for that record. This approach is useful when there is a specific class of interest that we want to identify ...
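The same lookup can return a propensity instead of a class label, and a sliding cutoff can then be applied to it. The sketch below is again a hypothetical illustration in Python (the cutoff value of 0.2 is arbitrary, chosen only to show that it need not be 0.5).

```python
import pandas as pd


def exact_bayes_propensity(train_df, outcome_col, new_record, class_of_interest):
    """Estimate P(class_of_interest | predictor profile) as the share of
    matching training records that belong to that class."""
    predictors = [c for c in train_df.columns if c != outcome_col]

    # Same exact-profile matching as before.
    mask = pd.Series(True, index=train_df.index)
    for col in predictors:
        mask &= train_df[col] == new_record[col]
    matches = train_df[mask]

    if matches.empty:
        return None  # no matching profile, so no propensity estimate

    return (matches[outcome_col] == class_of_interest).mean()


# Classify as the class of interest whenever the propensity clears a
# user-chosen cutoff, even if that class is not the most probable one.
# (Hypothetical column name "Loan" and cutoff 0.2, for illustration only.)
# p = exact_bayes_propensity(train_df, "Loan", new_record, class_of_interest="Yes")
# label = "Yes" if p is not None and p >= 0.2 else "No"
```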