## Book description

Developed from celebrated Harvard statistics lectures, Introduction to Probability provides essential language and tools for understanding statistics, randomness, and uncertainty. The book explores a wide variety of applications and examples, ranging from coincidences and paradoxes to Google PageRank and Markov chain Monte Carlo (MCMC). Additional application areas explored include genetics, medicine, computer science, and information theory. The print book version includes a code that provides free access to an eBook version.

The authors present the material in an accessible style and motivate concepts using real-world examples. Throughout, they use stories to uncover connections between the fundamental distributions in statistics, and they use conditioning to reduce complicated problems to manageable pieces.

The book includes many intuitive explanations, diagrams, and practice problems. Each chapter ends with a section showing how to perform relevant simulations and calculations in R, a free statistical software environment.
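To give a flavor of the chapter-end simulations (the book itself uses R; the following is an illustrative Python sketch, not taken from the book), here is a Monte Carlo estimate for a classic coincidence example, the birthday problem:

```python
import random

def birthday_match_prob(n_people=23, trials=100_000, seed=42):
    """Estimate P(at least two of n_people share a birthday) by simulation."""
    rng = random.Random(seed)
    matches = 0
    for _ in range(trials):
        # Draw birthdays uniformly from 365 days; a repeat means a match.
        bdays = [rng.randrange(365) for _ in range(n_people)]
        if len(set(bdays)) < n_people:
            matches += 1
    return matches / trials

print(birthday_match_prob())  # close to the exact answer of about 0.507
```

With 23 people the simulated probability lands near 0.507, the well-known counterintuitive result that a shared birthday is more likely than not.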

1. Preliminaries
2. Preface
3. Chapter 1 Probability and counting
1. 1.1 Why study probability?
2. 1.2 Sample spaces and Pebble World
3. 1.3 Naive definition of probability
4. 1.4 How to count
5. 1.5 Story proofs
6. 1.6 Non-naive definition of probability
7. 1.7 Recap
8. 1.8 R
9. 1.9 Exercises
4. Chapter 2 Conditional probability
1. 2.1 The importance of thinking conditionally
2. 2.2 Definition and intuition
3. 2.3 Bayesâ rule and the law of total probability
4. 2.4 Conditional probabilities are probabilities
5. 2.5 Independence of events
6. 2.6 Coherency of Bayesâ rule
7. 2.7 Conditioning as a problem-solving tool
8. 2.8 Pitfalls and paradoxes
9. 2.9 Recap
10. 2.10 R
11. 2.11 Exercises
5. Chapter 3 Random variables and their distributions
1. 3.1 Random variables
2. 3.2 Distributions and probability mass functions
3. 3.3 Bernoulli and Binomial
4. 3.4 Hypergeometric
5. 3.5 Discrete Uniform
6. 3.6 Cumulative distribution functions
7. 3.7 Functions of random variables
8. 3.8 Independence of r.v.s
9. 3.9 Connections between Binomial and Hypergeometric
10. 3.10 Recap
11. 3.11 R
12. 3.12 Exercises
6. Chapter 4 Expectation
1. 4.1 Definition of expectation
2. 4.2 Linearity of expectation
3. 4.3 Geometric and Negative Binomial
4. 4.4 Indicator r.v.s and the fundamental bridge
5. 4.5 Law of the unconscious statistician (LOTUS)
6. 4.6 Variance
7. 4.7 Poisson
8. 4.8 Connections between Poisson and Binomial
9. 4.9 *Using probability and expectation to prove existence
10. 4.10 Recap
11. 4.11 R
12. 4.12 Exercises
7. Chapter 5 Continuous random variables
1. 5.1 Probability density functions
2. 5.2 Uniform
3. 5.3 Universality of the Uniform
4. 5.4 Normal
5. 5.5 Exponential
6. 5.6 Poisson processes
7. 5.7 Symmetry of i.i.d. continuous r.v.s
8. 5.8 Recap
9. 5.9 R
10. 5.10 Exercises
8. Chapter 6 Moments
1. 6.1 Summaries of a distribution
2. 6.2 Interpreting moments
3. 6.3 Sample moments
4. 6.4 Moment generating functions
5. 6.5 Generating moments with MGFs
6. 6.6 Sums of independent r.v.s via MGFs
7. 6.7 *Probability generating functions
8. 6.8 Recap
9. 6.9 R
10. 6.10 Exercises
9. Chapter 7 Joint distributions
1. 7.1 Joint, marginal, and conditional
2. 7.2 2D LOTUS
3. 7.3 Covariance and correlation
4. 7.4 Multinomial
5. 7.5 Multivariate Normal
6. 7.6 Recap
7. 7.7 R
8. 7.8 Exercises
10. Chapter 8 Transformations
1. 8.1 Change of variables
2. 8.2 Convolutions
3. 8.3 Beta
4. 8.4 Gamma
5. 8.5 Beta-Gamma connections
6. 8.6 Order statistics
7. 8.7 Recap
8. 8.8 R
9. 8.9 Exercises
11. Chapter 9 Conditional expectation
1. 9.1 Conditional expectation given an event
2. 9.2 Conditional expectation given an r.v.
3. 9.3 Properties of conditional expectation
4. 9.4 *Geometric interpretation of conditional expectation
5. 9.5 Conditional variance
6. 9.6 Adam and Eve examples
7. 9.7 Recap
8. 9.8 R
9. 9.9 Exercises
12. Chapter 10 Inequalities and limit theorems
1. 10.1 Inequalities
2. 10.2 Law of large numbers
3. 10.3 Central limit theorem
4. 10.4 Chi-Square and Student-t
5. 10.5 Recap
6. 10.6 R
7. 10.7 Exercises
13. Chapter 11 Markov chains
1. 11.1 Markov property and transition matrix
2. 11.2 Classification of states
3. 11.3 Stationary distribution
4. 11.4 Reversibility
5. 11.5 Recap
6. 11.6 R
7. 11.7 Exercises
14. Chapter 12 Markov chain Monte Carlo
1. 12.1 Metropolis-Hastings
2. 12.2 Gibbs sampling
3. 12.3 Recap
4. 12.4 R
5. 12.5 Exercises
15. Chapter 13 Poisson processes
1. 13.1 Poisson processes in one dimension
2. 13.2 Conditioning, superposition, thinning
3. 13.3 Poisson processes in multiple dimensions
4. 13.4 Recap
5. 13.5 R
6. 13.6 Exercises
16. A Math
1. A.1 Sets
2. A.2 Functions
3. A.3 Matrices
4. A.4 Difference equations
5. A.5 Differential equations
6. A.6 Partial derivatives
7. A.7 Multiple integrals
8. A.8 Sums
9. A.9 Pattern recognition
10. A.10 Common sense and checking answers
17. B R
18. C Table of distributions
19. Bibliography
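Chapter 12 covers Markov chain Monte Carlo, including the Metropolis-Hastings algorithm mentioned in the description. As an illustration of the idea (a minimal Python sketch under simplifying assumptions, not the book's R code), here is a random-walk Metropolis sampler targeting the standard Normal distribution:

```python
import math
import random

def metropolis_normal(n_samples=50_000, step=1.0, seed=0):
    """Random-walk Metropolis sampler for the standard Normal target."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.uniform(-step, step)
        # Accept with probability min(1, target(proposal)/target(x)).
        # For the standard Normal, the density ratio simplifies to
        # exp((x^2 - proposal^2) / 2).
        if rng.random() < math.exp((x * x - proposal * proposal) / 2):
            x = proposal
        samples.append(x)
    return samples

draws = metropolis_normal()
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

The sample mean and variance of the draws settle near 0 and 1, the moments of the standard Normal, even though the sampler only ever evaluates the target density up to a constant.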

## Product information

• Title: Introduction to Probability
• Author(s): Joseph K. Blitzstein, Jessica Hwang
• Release date: September 2015
• Publisher(s): CRC Press
• ISBN: 9781498759762