Chapter 9. Evidence and Probabilities
Fundamental concepts: Explicit evidence combination with Bayes’ Rule; Probabilistic reasoning via assumptions of conditional independence.
Exemplary techniques: Naive Bayes classification; Evidence lift.
So far we have examined several different methods for using data to help draw conclusions about some unknown quantity of a data instance, such as its classification. Let's now examine a different way of drawing such conclusions. The things we know about a data instance, represented as its features, can be thought of as evidence for or against different values of the target. If we knew the strength of the evidence given by each feature, we could apply principled methods for combining that evidence probabilistically to reach a conclusion about the value of the target. We will determine the strength of any particular piece of evidence from the training data.
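To make the idea concrete, here is a minimal sketch (not the book's own code) of how per-feature evidence counted from training data can be combined with Bayes' Rule under an assumption of conditional independence, i.e., a bare-bones Naive Bayes classifier. The function names, the toy data, and the Laplace smoothing constant are illustrative assumptions, not anything prescribed by the text.

```python
from collections import defaultdict
import math

# A minimal Naive Bayes sketch: each feature value contributes evidence
# (a likelihood estimated from training counts), and Bayes' Rule combines
# that evidence, under the naive assumption of conditional independence,
# into a posterior probability for each target value.

def train(instances, labels):
    """Count class frequencies and (class, feature, value) co-occurrences."""
    class_counts = defaultdict(int)
    value_counts = defaultdict(int)   # keyed by (class, feature_index, value)
    for features, c in zip(instances, labels):
        class_counts[c] += 1
        for i, v in enumerate(features):
            value_counts[(c, i, v)] += 1
    return class_counts, value_counts

def posterior(features, class_counts, value_counts, smoothing=1.0):
    """Combine per-feature evidence with Bayes' Rule (in log space)."""
    total = sum(class_counts.values())
    log_scores = {}
    for c, n_c in class_counts.items():
        log_p = math.log(n_c / total)                # prior: p(c)
        for i, v in enumerate(features):
            count = value_counts.get((c, i, v), 0)
            # Laplace smoothing so an unseen feature value cannot zero out a class
            # (denominator assumes binary features; adjust for more values)
            log_p += math.log((count + smoothing) / (n_c + 2 * smoothing))
        log_scores[c] = log_p
    # Convert log scores back into normalized probabilities
    m = max(log_scores.values())
    exp_scores = {c: math.exp(s - m) for c, s in log_scores.items()}
    z = sum(exp_scores.values())
    return {c: s / z for c, s in exp_scores.items()}

# Hypothetical toy data: binary features (e.g., "visited page X?") and a target class
X = [(1, 0, 1), (1, 1, 1), (0, 0, 1), (0, 1, 0)]
y = ["buyer", "buyer", "non-buyer", "non-buyer"]
class_counts, value_counts = train(X, y)
print(posterior((1, 0, 1), class_counts, value_counts))
```

Working in log space simply avoids numerical underflow when many small likelihoods are multiplied together; the combination of evidence is still the product dictated by the conditional independence assumption.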
Example: Targeting Online Consumers With Advertisements
To illustrate, let’s consider another business application of classification: targeting online display advertisements to consumers, based on what webpages they have visited in the past. As consumers, we have become used to getting a vast amount of information and services on the Web seemingly for free. Of course, the “for free” part is very often due to the existence or promise of revenue from online advertising, similar to how broadcast television is “free.” ...