The law of large numbers is a theorem in probability theory describing the average of a large number of random variables: under suitable conditions, this average converges, in some sense, to the expected value. In this chapter, we introduce several laws of large numbers.

**Definition 16.1** (Almost Sure Convergence). Let *X, X*_{1}, *X*_{2},… be random variables on a probability space (Ω, ℱ, *P*). The sequence {*X*_{n}}_{n≥1} is said to converge to *X* almost surely if and only if it converges to *X* almost everywhere; that is, there is a set *A* ∈ ℱ such that *X*_{n} → *X* on *A* and *P*(*A*^{c}) = *P*(Ω\*A*) = 0.
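As an illustration of Definition 16.1 (this sketch is not part of the text): by the strong law of large numbers, the running average of i.i.d. Bernoulli(1/2) coin flips converges to 1/2 almost surely, i.e. along almost every sample path. A minimal simulation of one such path:

```python
import numpy as np

rng = np.random.default_rng(0)

def running_means(n_flips):
    """Running averages X_bar_1, ..., X_bar_n along one sample path
    of i.i.d. Bernoulli(1/2) flips (0s and 1s)."""
    flips = rng.integers(0, 2, size=n_flips)
    return np.cumsum(flips) / np.arange(1, n_flips + 1)

means = running_means(100_000)
# Far along the path, the running mean is close to the expected value 1/2.
print(abs(means[-1] - 0.5) < 0.01)
```

A single simulated path can only suggest, not prove, pathwise convergence; the point is that almost sure convergence is a statement about individual sample paths, unlike the convergence in probability defined next.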

**Definition 16.2** (Convergence in Probability). Let *X*_{1}, *X*_{2},… be a sequence of random variables on (Ω, ℱ, *P*). The sequence {*X*_{n}} is said to converge in probability to a random variable *X* on (Ω, ℱ, *P*) if for every ε > 0, we have

*P*(|*X*_{n} − *X*| > ε) → 0 as *n* → ∞.
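The definition above can be explored empirically (again, a sketch that is not part of the text): for *X*_{i} i.i.d. Uniform(0, 1), the sample mean converges in probability to *E*[*X*_{i}] = 1/2, so the tail probability *P*(|*X̄*_{n} − 1/2| > ε) should shrink as *n* grows. We estimate it by Monte Carlo over many independent repetitions:

```python
import numpy as np

rng = np.random.default_rng(1)
eps = 0.05    # the tolerance epsilon in the definition
reps = 2_000  # independent repetitions for the Monte Carlo estimate

def tail_prob(n):
    """Monte Carlo estimate of P(|X_bar_n - 1/2| > eps) for the
    sample mean of n i.i.d. Uniform(0, 1) variables."""
    means = rng.random((reps, n)).mean(axis=1)
    return np.mean(np.abs(means - 0.5) > eps)

probs = [tail_prob(n) for n in (10, 100, 1000)]
print(probs)  # estimates shrink toward 0 as n grows
```

Note the contrast with Definition 16.1: here we look at the distribution of *X̄*_{n} at each fixed *n* across many repetitions, not at the long-run behavior of a single path.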
