Chapter 6. Matrices, Part 2
Matrix multiplication is one of the most wonderful gifts that mathematicians have bestowed upon us. But to move from elementary to advanced linear algebra—and then to understand and develop data science algorithms—you need to do more than just multiply matrices.
We begin this chapter with discussions of matrix norms and matrix spaces. Matrix norms are essentially an extension of vector norms, and matrix spaces are essentially an extension of vector subspaces (which in turn are nothing more than linear weighted combinations). So you already have the necessary background knowledge for this chapter.
Concepts like linear independence, rank, and determinant will allow you to transition from understanding elementary concepts like transpose and multiplication to understanding advanced topics like inverse, eigenvalues, and singular values. And those advanced topics unlock the power of linear algebra for applications in data science. Therefore, this chapter is a waypoint in your transformation from linear algebra newbie to linear algebra knowbie.1
Matrices seem like such simple things—just a spreadsheet of numbers. But you’ve already seen in the previous chapters that there is more to matrices than meets the eye. So, take a deep and calming breath and dive right in.
Matrix Norms
You learned about vector norms in Chapter 2: the norm of a vector is its Euclidean geometric length, which is computed as the square root of the sum of the squared vector elements. ...
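Since this book works in Python with NumPy, here is a minimal sketch of the idea: compute a vector's Euclidean norm directly from its definition, then extend the same formula to a matrix (this matrix version is known as the Frobenius norm). The variable names are illustrative, not from the text.

```python
import numpy as np

# Euclidean (L2) norm of a vector: the square root of the
# sum of the squared vector elements.
v = np.array([3.0, 4.0])
vec_norm = np.sqrt(np.sum(v**2))
print(vec_norm)             # 5.0
print(np.linalg.norm(v))    # NumPy's built-in gives the same result

# The same formula applied to all elements of a matrix
# yields the Frobenius norm.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
frob_norm = np.sqrt(np.sum(A**2))
print(np.isclose(frob_norm, np.linalg.norm(A, 'fro')))  # True
```

Note how `np.linalg.norm` confirms the hand-computed values; the Frobenius norm is only one of several matrix norms, as the rest of this section discusses.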