Practical applications of Bayes' rule that compute posterior probabilities exactly are limited because evaluating the evidence term in the denominator is challenging. The evidence reflects the probability of the observed data over all possible parameter values. It is also called the marginal likelihood because it requires marginalizing out the parameters by summing or integrating over their distribution. This is generally only possible in simple cases with a small number of discrete parameters that assume very few values.
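To make this concrete, the following is a minimal sketch of an exact posterior computation for a hypothetical coin-flip example, where a single parameter (the probability of heads) takes only a handful of discrete values so that the evidence can be obtained by summation:

```python
import numpy as np
from scipy import stats

# Hypothetical example: exact Bayes' rule for a single discrete parameter theta
# (probability of heads) restricted to a small grid of candidate values.
thetas = np.linspace(0.1, 0.9, 9)               # candidate parameter values
prior = np.full(len(thetas), 1 / len(thetas))   # uniform prior over the grid

heads, n = 6, 10                                 # observed data: 6 heads in 10 tosses
likelihood = stats.binom.pmf(heads, n, thetas)   # P(data | theta) for each theta

# The evidence (marginal likelihood) sums the joint probability over all
# parameter values; this is feasible only because theta takes very few values.
evidence = np.sum(likelihood * prior)

# Bayes' rule: posterior is proportional to likelihood x prior, normalized by the evidence.
posterior = likelihood * prior / evidence
print(posterior.round(3))
```

With continuous parameters, the sum over the grid becomes an integral that rarely has a closed-form solution, which is why exact computation breaks down outside such simple cases.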
Maximum a posteriori probability (MAP) estimation leverages the fact that the evidence is a constant factor that scales the posterior to meet the requirements ...