Chapter 9. Orthogonal Matrices and QR Decomposition
You will learn five major decompositions in this book: orthogonal vector decomposition, QR decomposition, LU decomposition, eigendecomposition, and singular value decomposition. Those are not the only decompositions in linear algebra, but they are the most important ones for data science and machine learning.
In this chapter, you will learn QR. And along the way, you’ll learn a new special type of matrix (orthogonal). QR decomposition is a workhorse that powers applications including the matrix inverse, least squares model fitting, and eigendecomposition. Therefore, understanding and gaining familiarity with QR decomposition will help you level up your linear algebra skills.
Orthogonal Matrices
I will begin by introducing you to orthogonal matrices. An orthogonal matrix is a special matrix that is important for several decompositions, including QR, eigendecomposition, and singular value decomposition. The letter Q is often used to indicate orthogonal matrices. Orthogonal matrices have two properties:
- Orthogonal columns: All columns are pair-wise orthogonal.
- Unit-norm columns: The norm (geometric length) of each column is exactly 1.
We can translate those two properties into a mathematical expression (remember that aᵀb is an alternative notation for the dot product a·b):

QᵀQ = I
What does that mean? It means that the dot product of each column with itself is 1 (the columns have unit norm), while the dot product of each column with every other column is 0 (the columns are pair-wise orthogonal).
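As a quick check of these two properties, here is a minimal NumPy sketch (NumPy is assumed here; the book's own code may differ). It builds an orthogonal matrix Q from a random matrix via `np.linalg.qr` and verifies that QᵀQ is the identity:

```python
import numpy as np

# Create a random square matrix and extract an orthogonal Q from it.
rng = np.random.default_rng(42)
A = rng.standard_normal((4, 4))
Q, R = np.linalg.qr(A)  # Q has orthogonal, unit-norm columns

# Q^T Q should equal the identity matrix (up to floating-point precision):
QtQ = Q.T @ Q
print(np.allclose(QtQ, np.eye(4)))  # True

# Diagonal entries are the dot products of each column with itself (all 1);
# off-diagonal entries are dot products between different columns (all 0).
print(np.round(QtQ, 10))
```

Note that the check uses `np.allclose` rather than exact equality, because floating-point arithmetic leaves tiny rounding errors in the off-diagonal entries.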