Let **A** and **B** be two matrices with the same dimensions. Notation such as **A** ≥ 0, **A** > 0 or **A** ≤ **B** is to be understood component-wise. In this chapter, we deal with square matrices **A** ≥ 0, i.e. with non-negative entries. There is a rich theory about their spectra, eigenspaces and powers that turns out to be of particular interest when considering adjacency matrices of graphs. This theory also has many applications, ranging from probability theory and Markov chains^{1} to dynamical systems (see, for instance, [BRI 02]). Perron’s theorem is at the core of Google’s PageRank algorithm discussed in the following chapter. For standard textbooks on matrix theory including discussions of Perron–Frobenius theory see, for instance, [HOR 13] or [SEN 06, GAN 59]. We do not prove Perron’s theorem in this book.

## 9.1. Primitive graphs and Perron’s theorem

We start with a quite specific property (more restrictive than strong connectedness) that has an important asymptotic consequence, expressed by relation [9.3].

Let *t* ≥ 1 be an integer. A matrix **A** ∈ ℝ^{t×t} with non-negative entries is *primitive* if there exists an integer *N* such that **A**^{N} is positive, i.e. all the entries of **A**^{N} are positive. Primitive matrices have strong properties, described by what is known as Perron–Frobenius^{2} theory.
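The definition can be tested directly by computing powers of **A**. The sketch below is not from the book: `is_primitive` is a hypothetical helper, and it relies on Wielandt’s classical bound, which states that a primitive *t* × *t* matrix already satisfies **A**^{N} > 0 for N = (t − 1)² + 1, so only finitely many powers need to be checked.

```python
import numpy as np

def is_primitive(A):
    """Test whether a non-negative square matrix A is primitive,
    i.e. whether some power A^N has all entries positive.
    By Wielandt's bound it suffices to check N up to (t-1)^2 + 1."""
    A = np.asarray(A, dtype=float)
    t = A.shape[0]
    max_power = (t - 1) ** 2 + 1  # Wielandt's bound (assumed, see lead-in)
    # Only the zero pattern of the entries matters, so work with 0/1
    # matrices to avoid numerical overflow for large powers.
    P = (A > 0).astype(int)
    B = P.copy()  # B holds the pattern of A^k
    for _ in range(max_power):
        if (B > 0).all():
            return True
        B = ((B @ P) > 0).astype(int)  # pattern of the next power
    return bool((B > 0).all())

# A directed 3-cycle with an extra loop at vertex 0 is primitive:
A = np.array([[1, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
print(is_primitive(A))  # True: A^4 is already entrywise positive

# A pure directed 3-cycle is strongly connected but not primitive:
C = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])
print(is_primitive(C))  # False: its powers are permutation matrices
```

The second example illustrates why primitivity is strictly stronger than strong connectedness: every power of the cycle matrix keeps zero entries, so no power is positive.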

DEFINITION 9.1.– *A directed multigraph is **primitive** if its adjacency matrix is primitive.* ...