# Chapter 9. Orthogonal Matrices and QR Decomposition

You will learn five major decompositions in this book: orthogonal vector decomposition, QR decomposition, LU decomposition, eigendecomposition, and singular value decomposition. Those are not the only decompositions in linear algebra, but they are the most important ones for data science and machine learning.

In this chapter, you will learn QR decomposition, and along the way you'll meet a new special type of matrix: the orthogonal matrix. QR decomposition is a workhorse that powers applications including the matrix inverse, least squares model fitting, and eigendecomposition. Therefore, understanding and gaining familiarity with QR decomposition will help you level up your linear algebra skills.

# Orthogonal Matrices

I will begin by introducing you to orthogonal matrices. An orthogonal matrix is a special matrix that is important for several decompositions, including QR, eigendecomposition, and singular value decomposition. The letter $\mathbf{Q}$ is often used to indicate orthogonal matrices. Orthogonal matrices have two properties:

Orthogonal columns

All columns are pairwise orthogonal.

Unit-norm columns

The norm (geometric length) of each column is exactly 1.
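The two properties can be checked numerically. The sketch below (using NumPy, as elsewhere in this book) uses a 2×2 rotation matrix, a classic example of an orthogonal matrix; the angle `theta` is an arbitrary choice for illustration:

```python
import numpy as np

# A rotation matrix is a simple example of an orthogonal matrix.
theta = np.pi / 5  # arbitrary rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Property 1: the columns are pairwise orthogonal (dot product is 0).
print(np.dot(Q[:, 0], Q[:, 1]))

# Property 2: each column has unit norm (geometric length of 1).
print(np.linalg.norm(Q[:, 0]), np.linalg.norm(Q[:, 1]))
```

Up to floating-point rounding error, the dot product prints as 0 and both norms print as 1.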

We can translate those two properties into a mathematical expression (remember that $\langle \mathbf{a}, \mathbf{b} \rangle$ is an alternative notation for the dot product):

$$\langle \mathbf{q}_i, \mathbf{q}_j \rangle = \begin{cases} 0, & \text{if } i \neq j \\ 1, & \text{if } i = j \end{cases}$$

What does that mean? It means that the dot product of a column with itself is 1, while the dot product between any two different columns is 0. In matrix notation, this is equivalent to saying that $\mathbf{Q}^\mathrm{T}\mathbf{Q} = \mathbf{I}$.
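One way to see this condition in action is to generate an orthogonal matrix and compute all of the pairwise dot products at once via $\mathbf{Q}^\mathrm{T}\mathbf{Q}$. The sketch below obtains a $\mathbf{Q}$ from NumPy's `np.linalg.qr` applied to a random matrix (the seed and matrix size are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(17)  # arbitrary seed for reproducibility

# QR decomposition of a random matrix yields an orthogonal Q
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

# The (i,j)th entry of Q^T Q is <q_i, q_j>:
# 1 on the diagonal (i == j), 0 off the diagonal (i != j)
print(np.round(Q.T @ Q, 10))
```

Up to rounding error, the printed matrix is the 4×4 identity matrix.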
