Chapter 10. Row Reduction and LU Decomposition
Now we move to LU decomposition. LU, like QR, is one of the computational backbones underlying data-science algorithms, including least squares model fitting and the matrix inverse. This chapter is, therefore, pivotal to your linear algebra education.
The thing about LU decomposition is that you cannot simply learn it immediately. Instead, you first need to learn about systems of equations, row reduction, and Gaussian elimination. And in the course of learning those topics, you’ll also learn about echelon matrices and permutation matrices. Oh yes, dear reader, this will be an exciting and action-packed chapter.
Systems of Equations
To understand LU decomposition and its applications, you need to understand row reduction and Gaussian elimination. And to understand those topics, you need to understand how to manipulate equations, convert them into a matrix equation, and solve that matrix equation using row reduction.
Let’s start with a “system” of one equation:
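For example, a single equation in one unknown (the specific numbers here are just for illustration):

$$2x = 8$$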
As I’m sure you learned in school, you can do various mathematical manipulations to the equation—as long as you do the same thing to both sides of the equation. That means that the following equation is not the same as the previous one, but they are related to each other by simple manipulations. More importantly, any solution to one equation is a solution to the other:
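Continuing the illustrative example, multiplying both sides by 3 and then adding 1 to both sides gives

$$6x + 1 = 25$$

which looks different from $2x = 8$, yet $x = 4$ solves both.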
Now let’s move to a system of two equations:
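An illustrative two-equation system (again, the coefficients are made up for demonstration) is

$$\begin{aligned}
x + 2y &= 5 \\
3x - y &= 1
\end{aligned}$$

As a preview of where this chapter is headed, the following minimal NumPy sketch expresses that illustrative system as a matrix equation $\mathbf{A}\mathbf{x} = \mathbf{b}$ and solves it directly; the rest of the chapter works through how such solutions are obtained by hand via row reduction.

```python
import numpy as np

# Coefficient matrix and constants vector for the illustrative system:
#    x + 2y = 5
#   3x -  y = 1
A = np.array([[1, 2],
              [3, -1]])
b = np.array([5, 1])

# Solve the matrix equation Ax = b
x = np.linalg.solve(A, b)
print(x)  # [1. 2.]
```

The solution is $x = 1$, $y = 2$, which you can confirm satisfies both equations.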