Chapter 7. Minimum, Maximum, and Mixture
In the previous chapter we computed distributions of sums. In this chapter, we’ll compute distributions of minimums and maximums, and use them to solve both Forward and Inverse Problems.
Then we’ll look at distributions that are mixtures of other distributions, which will turn out to be particularly useful for making predictions.
But we’ll start with a powerful tool for working with distributions, the cumulative distribution function.
Cumulative Distribution Functions
So far we have been using probability mass functions to represent distributions. A useful alternative is the cumulative distribution function, or CDF.
As an example, I’ll use the posterior distribution from the Euro Problem, which we computed in “Bayesian Estimation”.
Here’s the uniform prior we started with:
import numpy as np
from empiricaldist import Pmf

# 101 equally spaced hypotheses for x, the probability of heads
hypos = np.linspace(0, 1, 101)
pmf = Pmf(1, hypos)

# the data: 140 heads in 250 spins
data = 140, 250
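As a quick sanity check (my addition, not part of the book's text), we can confirm what this prior looks like: Pmf(1, hypos) gives every hypothesis the same unnormalized weight, and the weights are normalized during the update.

# Illustrative check, not from the book: every hypothesis starts
# with the same unnormalized weight, so the prior is uniform.
print(len(pmf))     # 101 hypotheses
print(pmf[0.5])     # 1, the same weight as every other quantity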
And here’s the update:
from scipy.stats import binom

def update_binomial(pmf, data):
    """Update pmf using the binomial distribution."""
    k, n = data
    xs = pmf.qs
    # likelihood of k heads in n spins for each hypothetical x
    likelihood = binom.pmf(k, n, xs)
    pmf *= likelihood
    pmf.normalize()

update_binomial(pmf, data)
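At this point pmf contains the posterior distribution. As a quick check (an illustration I've added, using empiricaldist's max_prob method), we can find the hypothesis with the highest posterior probability:

# The maximum a posteriori (MAP) estimate: the quantity with the
# highest posterior probability, which is 140/250 = 0.56.
print(pmf.max_prob())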
The CDF is the cumulative sum of the PMF, so we can compute it like this:
cumulative = pmf.cumsum()
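Because cumulative has the same quantities as pmf, we can look it up with a value of x; the result is the total probability of all quantities less than or equal to x. For example (my illustration, not from the excerpt):

# Posterior probability that the proportion of heads is <= 0.61
print(cumulative[0.61])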
Here’s what it looks like, along with the PMF:
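A figure appears here in the original, showing the CDF along with the PMF. A minimal sketch of how one might reproduce it with Matplotlib (the labels are my assumptions, not from the excerpt):

import matplotlib.pyplot as plt

# Plot the posterior PMF and its cumulative sum on the same axes
pmf.plot(label='PMF')
cumulative.plot(label='CDF')
plt.xlabel('Proportion of heads (x)')
plt.ylabel('Probability')
plt.legend()
plt.show()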
The range of the CDF is always from 0 to 1, in contrast with the PMF, where the maximum can be any probability.