2 Review of Probability Theory
In this chapter, we first review the concept of probability, typical probability distributions relevant to statistical mechanics, and a few quantities that characterize each probability distribution. Then we introduce the concept of uncertainty, which is related to the entropy discussed in Chapter 4.
2.1 Probability
As is the case with most probability theory books, we start by considering the tossing of a coin. Its outcome is either “head” or “tail.” We do not know the result before tossing the coin; tossing the coin is a trial, and we know its result only a posteriori.
We call a specific outcome of a trial an “event.” A trial has two or more possible events, and we index them by $i$. In tossing a coin, $i = \mathrm{h}$ (head) and $\mathrm{t}$ (tail). When we roll a die, $i = 1, 2, 3, 4, 5$, and 6.
We can repeat the trial a total of $N$ times, in which event $i$ occurs $n_i$ times. In tossing a coin $N$ times, $n_\mathrm{h} + n_\mathrm{t} = N$. The ratios $n_\mathrm{h}/N$ and $n_\mathrm{t}/N$ are called relative frequencies. The sum of the relative frequencies of all possible events is equal to 1. In rolling a die $N$ times, the relative frequencies are $n_1/N, n_2/N, \dots$, and $n_6/N$, and their sum is 1.
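As a quick numerical illustration (a minimal sketch in Python; the die-roll counts below are invented for the example), the relative frequencies $n_i/N$ computed from any set of counts necessarily sum to 1, because $\sum_i n_i = N$:

```python
# Hypothetical counts n_i from rolling a die N times (numbers invented for illustration).
counts = {1: 160, 2: 175, 3: 158, 4: 172, 5: 169, 6: 166}
N = sum(counts.values())              # total number of trials

# Relative frequency of each face: n_i / N.
rel_freq = {face: n / N for face, n in counts.items()}

print(rel_freq)                       # e.g. {1: 0.16, 2: 0.175, ...}
print(sum(rel_freq.values()))         # sums to 1, since the n_i add up to N
```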
Consider starting a series of trials and updating the running counts $n_\mathrm{h}$ and $n_\mathrm{t}$. At an early stage, the relative frequencies $n_\mathrm{h}/N$ and $n_\mathrm{t}/N$ vary a lot each time a new trial is added. As $N$ increases, the relative frequencies approach their respective constant values. If the coin is a uniform disc, then $n_\mathrm{h}/N$ will approach $1/2$ as $N \to \infty$. Likewise, ...
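The settling of $n_\mathrm{h}/N$ toward $1/2$ is easy to observe in a short simulation (a sketch, assuming a fair coin modeled by a uniform random number; the checkpoint values of $N$ are arbitrary):

```python
import random

random.seed(0)                        # fixed seed so the run is reproducible

n_h = 0                               # running count of heads
N = 0                                 # running count of trials
for target in (10, 100, 1_000, 10_000, 100_000):
    while N < target:
        n_h += random.random() < 0.5  # one fair-coin toss; True counts as 1
        N += 1
    print(f"N = {N:>6}:  n_h/N = {n_h / N:.4f}")
```

Typically the early checkpoints fluctuate noticeably around $1/2$, while the later ones cluster close to 0.5, mirroring the behavior described above.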