# Chapter 3. Multivariate random variables

In the previous chapters, the objects of our attention were univariate random variables. Whenever two random variables were discussed, the tacit assumption was that they were independent. Indeed, the concept of independence is easy to grasp even though no formal definition has been given yet. The goal of this chapter is to give a formal definition of independence and, more importantly, to describe how dependence can be rigorously defined and studied using joint and conditional distributions. The conditional mean and the bivariate normal distribution are studied in detail because they are fundamental concepts for describing the relationship between random variables. Multivariate distributions of random vectors are conveniently handled using vector and matrix algebra, so linear algebra techniques and concepts such as matrix manipulations, eigenvectors, and eigenvalues will be used in this chapter. Many examples of multivariate continuous distributions can be found in the authoritative book by Kotz et al. [67].

## 3.1 Joint cdf and density

When two random variables are studied, their (multivariate) distribution should be defined. We shall use the term *joint*, or *bivariate* when only two random variables are involved. Knowing the individual (marginal) distributions is not enough to completely specify the joint distribution because the value ...
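The claim that marginals alone do not pin down the joint distribution can be illustrated with a small counterexample (a sketch of my own, not taken from the text): two different joint pmfs on $\{0,1\}\times\{0,1\}$ that share exactly the same marginal distributions.

```python
import numpy as np

# Two joint pmfs for (X, Y) on {0,1} x {0,1}, written as 2x2 arrays:
# rows indexed by x, columns by y. (Illustrative values, not from the text.)
joint_indep = np.array([[0.25, 0.25],
                        [0.25, 0.25]])   # X and Y independent
joint_comon = np.array([[0.50, 0.00],
                        [0.00, 0.50]])   # X = Y with probability 1

def marginals(joint):
    """Marginal pmfs of X (row sums over y) and Y (column sums over x)."""
    return joint.sum(axis=1), joint.sum(axis=0)

mx1, my1 = marginals(joint_indep)
mx2, my2 = marginals(joint_comon)

# Both joints yield the same marginals, P(X=0)=P(X=1)=1/2 and likewise for Y...
print(np.allclose(mx1, mx2) and np.allclose(my1, my2))  # True
# ...yet the joint distributions themselves are different.
print(np.allclose(joint_indep, joint_comon))  # False
```

Under the first joint, X and Y are independent; under the second, they are perfectly dependent. Since the marginals coincide, the dependence structure must be captured by the joint distribution itself.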
