Chapter 2

Random Variables

2.1 Introduction

We saw in the previous chapter that many random experiments have numerical outcomes. Even if the outcome itself is not numerical, as is the case in Example 1.4, where a coin is flipped twice, we often consider events that can be described in terms of numbers, for example, {the number of heads equals 2}. It would be convenient to have mathematical notation that spares us from spelling out all such events in words. For example, instead of writing {the number of heads equals 1} and {the number of heads equals 2}, we could denote the number of heads by X and consider the events {X = 1} and {X = 2}. The quantity X is then something whose value is not known before the experiment but becomes known afterward.

Definition 2.1. A random variable is a real-valued variable that gets its value from a random experiment.

There is a more formal definition, in which a random variable is a real-valued function on the sample space. If X denotes the number of heads in two coin flips, we would thus, for example, have X(HH) = 2. In a more advanced treatment of probability theory, this formal definition is necessary, but for our purposes, Definition 2.1 is enough.
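For instance, with two coin flips the sample space consists of the four outcomes HH, HT, TH, and TT, and X is the function given by

X(HH) = 2,   X(HT) = X(TH) = 1,   X(TT) = 0.

The event {X = 1} is then simply the subset {HT, TH} of the sample space, which has probability 1/2 if the coin is fair.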

A random variable X is thus something that does not have a value until after the experiment. Before the experiment, we can only describe the set of possible values, that is, the range of X, and the associated probabilities. Let us look at a simple example.

Example 2.1. Flip a coin ...
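To see the distinction between the range of X and an observed value of X in action, here is a minimal simulation sketch in Python. It is our illustration, not part of the text: it assumes a fair coin flipped twice, and the helper names flip_twice and X are ours. The range and probabilities of X can be written down before any experiment; an individual value of X exists only once an outcome has been generated.

import random
from collections import Counter

def flip_twice():
    """Perform the random experiment: flip a fair coin twice."""
    return tuple(random.choice("HT") for _ in range(2))

def X(outcome):
    """The random variable X: the number of heads in the outcome."""
    return outcome.count("H")

# Before the experiment, we can only describe the range of X ...
print("Range of X:", {0, 1, 2})

# ... and the associated probabilities, here estimated by repeating
# the experiment many times and counting the observed values of X.
n = 100_000
counts = Counter(X(flip_twice()) for _ in range(n))
for k in sorted(counts):
    print(f"P(X = {k}) is approximately {counts[k] / n:.3f}")

Running this prints estimates close to 1/4, 1/2, and 1/4 for X = 0, 1, and 2, matching the probabilities computed from the sample space above.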
