4.1 INTRODUCTION

A goal of this chapter is to extend the definition of a random variable to two or more jointly distributed random variables and to investigate how they are related. We first describe approximations of some of the random variables covered in Chapter 3. These not only provide insight into how these random variables are similar, but the approximations can also be used to perform computations that would otherwise be more difficult using the probability density function (pdf) of the original random variable. We then describe joint distributions, which are used to derive probabilities of events involving several random variables; such a collection of random variables can be represented by a random vector. This leads to conditional distributions, where probabilities are computed based on a conditioning event or a conditioning random variable. An example of the conditional pdf fY|X(y|x) for a specific value x is shown in Figure 4.1(a). Conditioning is important because it describes the extent to which two random variables are correlated and thus their “mutual information.”
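
The conditioning idea can be sketched numerically: a conditional pdf is the joint pdf evaluated along a slice x = x0 and renormalized by the marginal, fY|X(y|x0) = fX,Y(x0, y) / fX(x0). The bivariate Gaussian joint pdf below (zero means, unit variances, correlation rho) is an illustrative assumption, not necessarily the distribution behind Figure 4.1(a).

```python
import numpy as np

def joint_pdf(x, y, rho=0.5):
    # Standard bivariate Gaussian pdf with correlation rho (assumed example).
    norm = 1.0 / (2.0 * np.pi * np.sqrt(1.0 - rho**2))
    q = (x**2 - 2.0 * rho * x * y + y**2) / (1.0 - rho**2)
    return norm * np.exp(-q / 2.0)

y = np.linspace(-6.0, 6.0, 2001)
x0 = 1.0                                  # conditioning value x
fx = np.trapz(joint_pdf(x0, y), y)        # marginal f_X(x0): integrate out y
f_y_given_x = joint_pdf(x0, y) / fx       # conditional pdf slice f_{Y|X}(y|x0)

# For this Gaussian example the conditional mean should equal rho * x0.
cond_mean = np.trapz(y * f_y_given_x, y)
print(cond_mean)
```

For the jointly Gaussian case the slice is again Gaussian, with mean rho*x0 and variance 1 - rho^2, so the numerically computed conditional mean should land near 0.5 here.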

We also consider independence, which is a property used in Chapters 6 and 7 to characterize random processes. Techniques for deriving the pdf and cumulative distribution function (cdf) of a random variable that is a transformation of another random variable are presented along with several examples. For example, a specific nonlinear transformation of an exponential random variable leads to the logistic random variable shown in Figure 4.1(b). The transformations ...
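
As a sketch of such a transformation, one nonlinear mapping that carries an exponential random variable to a logistic one is Y = ln(e^X - 1) for X ~ Exponential(1); this particular mapping is an assumption for illustration, since the excerpt does not spell out the transformation used in Figure 4.1(b). Its cdf works out to the standard logistic cdf 1/(1 + e^{-y}), which a quick simulation can corroborate:

```python
import numpy as np

rng = np.random.default_rng(0)

x = rng.exponential(scale=1.0, size=200_000)   # X ~ Exp(1)
y = np.log(np.expm1(x))                        # Y = ln(e^X - 1), assumed mapping

# Compare the empirical cdf of Y with the logistic cdf at a few points.
for t in (-2.0, 0.0, 2.0):
    empirical = np.mean(y <= t)
    logistic = 1.0 / (1.0 + np.exp(-t))
    print(f"t={t:+.1f}  empirical={empirical:.3f}  logistic={logistic:.3f}")
```

The check follows from P(Y <= y) = P(X <= ln(1 + e^y)) = 1 - e^{-ln(1 + e^y)} = 1/(1 + e^{-y}), which is exactly the standard logistic cdf.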
