Machine Learning by Eihab Mohammed Bashier, Muhammad Badruddin Khan, Mohssen Mohammed

Chapter 4

Naïve Bayesian Classification

4.1 Introduction

Naïve Bayesian classifiers [1] are simple probabilistic classifiers based on applying Bayes' theorem under the strong (naïve) assumption that the features are independent of one another. The following equation [2] states Bayes' theorem in mathematical terms:

P(A|B) = P(A) P(B|A) / P(B)

where:

A and B are events

P(A) and P(B) are the prior probabilities of A and B without regard to each other

P(A|B), also called posterior probability, is the probability of observing event A given that B is true

P(B|A), also called likelihood, is the probability of observing event B given that A is true
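
With these terms in place, a small numeric example helps. The sketch below (Python) applies Bayes' theorem to a hypothetical diagnostic-test scenario; the probabilities are assumed for illustration and are not taken from the book.

    # Illustrative numbers only; they are assumed, not from the book.
    # Event A: a patient has a disease.  Event B: a diagnostic test is positive.
    p_a = 0.01              # P(A), the prior probability of the disease
    p_b_given_a = 0.95      # P(B|A), likelihood of a positive test given the disease
    p_b_given_not_a = 0.05  # P(B|not A), the false-positive rate

    # P(B) via the law of total probability
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

    # Bayes' theorem: P(A|B) = P(A) P(B|A) / P(B)
    p_a_given_b = p_a * p_b_given_a / p_b
    print(round(p_a_given_b, 3))   # 0.161

Even with a fairly accurate test, the posterior stays modest because the prior P(A) is small; this reweighting of a prior by a likelihood is exactly what the classifier does for each candidate class.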

Suppose that the vector X = (x1, x2, …, xn) is an instance (with n independent features) to be classified ...
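
This setup leads to the standard naïve Bayes decision rule: under the independence assumption, the posterior P(C | x1, …, xn) is proportional to P(C) multiplied by the product of P(xi | C) over all features, and X is assigned to the class with the largest score. The sketch below is a minimal illustration of that rule for categorical features; the feature names, values, and training counts are made up, and add-one smoothing is included only to avoid zero probabilities.

    from collections import defaultdict

    # Minimal categorical naïve Bayes sketch; the toy data is assumed,
    # not taken from the book.  Each instance is (feature tuple, class label).
    train = [
        (("sunny", "hot"),  "no"),
        (("sunny", "mild"), "no"),
        (("rainy", "mild"), "yes"),
        (("rainy", "cool"), "yes"),
        (("sunny", "cool"), "yes"),
    ]

    class_counts = defaultdict(int)        # N(C)
    feature_counts = defaultdict(int)      # N(C, feature index i, value x_i)
    values_per_feature = defaultdict(set)  # distinct values seen for feature i

    for features, label in train:
        class_counts[label] += 1
        for i, value in enumerate(features):
            feature_counts[(label, i, value)] += 1
            values_per_feature[i].add(value)

    def classify(x):
        """Return the class C maximizing P(C) * prod_i P(x_i | C)."""
        total = sum(class_counts.values())
        best_class, best_score = None, -1.0
        for c, n_c in class_counts.items():
            score = n_c / total  # P(C), estimated from class frequencies
            for i, value in enumerate(x):
                # P(x_i | C) with add-one (Laplace) smoothing
                score *= (feature_counts[(c, i, value)] + 1) / (n_c + len(values_per_feature[i]))
            if score > best_score:
                best_class, best_score = c, score
        return best_class

    print(classify(("sunny", "cool")))   # prints 'yes' for this toy data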
