Chapter 9. Orthogonal Matrices and QR Decomposition

You will learn five major decompositions in this book: orthogonal vector decomposition, QR decomposition, LU decomposition, eigendecomposition, and singular value decomposition. Those are not the only decompositions in linear algebra, but they are the most important ones for data science and machine learning.

In this chapter, you will learn QR decomposition, and along the way you'll learn about a new special type of matrix: the orthogonal matrix. QR decomposition is a workhorse that powers applications including the matrix inverse, least squares model fitting, and eigendecomposition. Therefore, understanding and gaining familiarity with QR decomposition will help you level up your linear algebra skills.
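As a quick preview, NumPy computes a QR decomposition with np.linalg.qr. The following minimal sketch (my own illustration, using a random matrix as a stand-in for real data) shows the call and confirms that the two factors reconstruct the original matrix:

    import numpy as np

    # a random "tall" matrix standing in for real data
    A = np.random.randn(6, 4)

    # QR decomposition: A = Q @ R, where Q has orthonormal columns
    # and R is upper triangular
    Q, R = np.linalg.qr(A)

    print(np.allclose(A, Q @ R))  # True: Q and R reconstruct A
                                  # (up to floating-point precision)

The details of what Q and R contain, and why this factorization is so useful, are the subject of this chapter.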

Orthogonal Matrices

I will begin by introducing you to orthogonal matrices. An orthogonal matrix is a special matrix that is important for several decompositions, including QR, eigendecomposition, and singular value decomposition. The letter 𝐐 is often used to indicate orthogonal matrices. Orthogonal matrices have two properties:

Orthogonal columns

All columns are pairwise orthogonal.

Unit-norm columns

The norm (geometric length) of each column is exactly 1.

We can translate those two properties into a mathematical expression (remember that ⟨𝐚, 𝐛⟩ is an alternative notation for the dot product):

⟨𝐪ᵢ, 𝐪ⱼ⟩ = 0, if i ≠ j
⟨𝐪ᵢ, 𝐪ⱼ⟩ = 1, if i = j

What does that mean? It means that the dot product of a column with itself is 1 (each column has unit norm), while the dot product between any two distinct columns is 0 (the columns are pairwise orthogonal). In matrix form, these two properties are equivalent to the compact statement 𝐐ᵀ𝐐 = 𝐈.
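These properties are easy to verify numerically. The sketch below (my own illustration; it relies on the fact that the Q returned by np.linalg.qr has orthonormal columns) checks both properties on a random example:

    import numpy as np

    # get an orthogonal matrix Q from the QR decomposition
    # of a random square matrix
    Q, _ = np.linalg.qr(np.random.randn(5, 5))

    # property 1: every column has unit norm
    print(np.linalg.norm(Q, axis=0))        # [1. 1. 1. 1. 1.]

    # property 2: distinct columns are orthogonal; together with
    # the unit norms, this means Q.T @ Q is the identity matrix
    print(np.allclose(Q.T @ Q, np.eye(5)))  # True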
