In this chapter we first present the complete, or exact, Bayesian classifier. We see that it can be used either to maximize overall classification accuracy or to focus on identifying records belonging to a particular class of interest. We next see that it is impractical in most cases and learn how to modify it (the "naive Bayesian classifier") so that it is generally applicable. The naive Bayesian classifier can be used only with categorical variables.

The naive Bayes method (and, indeed, an entire branch of statistics) is named after the Reverend Thomas Bayes (1702–1761). To understand the naive Bayes classifier, we first look at the complete, or exact, Bayesian classifier. The basic principle is simple. For each record to be classified:

Find all the other records just like it (i.e., where the predictor values are the same).

Determine what classes they all belong to and which class is most prevalent.

Assign that class to the new record.
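The three steps above can be sketched directly in code. The function and data names below are hypothetical, and this is only a toy illustration of the exact method, assuming categorical predictors stored as tuples:

```python
from collections import Counter

def exact_bayes_classify(new_record, records, labels):
    """Classify new_record by majority vote among the training records
    whose predictor values all match it exactly."""
    # Step 1: find all records just like the new one
    matching = [lab for rec, lab in zip(records, labels) if rec == new_record]
    if not matching:
        return None  # no identical records exist in the training data
    # Steps 2 and 3: find the most prevalent class and assign it
    return Counter(matching).most_common(1)[0][0]

# Toy example with two categorical predictors
records = [("yes", "high"), ("yes", "high"), ("no", "low"), ("yes", "high")]
labels = ["owner", "owner", "nonowner", "nonowner"]
print(exact_bayes_classify(("yes", "high"), records, labels))  # majority class: "owner"
```

Note the `return None` branch: when no training record matches the new record exactly, the exact method has nothing to vote on, which foreshadows why it is impractical with many predictors.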

Alternatively (or in addition), it may be desirable to tweak the method so that it answers the question: What is the estimated probability of belonging to the class of interest? instead of Which class is the most probable? Obtaining class probabilities allows using a sliding cutoff to classify a record as belonging to class *i*, even if *i* is not the most probable class for that record. This approach is useful when there is a specific class of interest that we want to identify.
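A minimal sketch of this probability-plus-cutoff variant, again with hypothetical names and toy data: the probability of the class of interest is estimated as its share among the exactly matching records, and a cutoff below 0.5 can flag a record as the class of interest even when that class is not the majority.

```python
from collections import Counter

def class_probability(new_record, records, labels, cls):
    """Estimate P(cls | predictors) as the proportion of exactly
    matching training records that belong to cls."""
    matching = [lab for rec, lab in zip(records, labels) if rec == new_record]
    if not matching:
        return None  # no identical records to base an estimate on
    return Counter(matching)[cls] / len(matching)

# Toy data: four identical records, only one of which is an "owner"
records = [("yes", "high")] * 4
labels = ["owner", "nonowner", "nonowner", "nonowner"]
p = class_probability(("yes", "high"), records, labels, "owner")  # 0.25
# With a sliding cutoff of 0.2, classify as "owner" even though
# "owner" is not the most probable class for this record
print("owner" if p >= 0.2 else "nonowner")
```

Lowering the cutoff trades more false positives for fewer missed records of the class of interest, which is the point of returning probabilities rather than a single most-probable class.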
