Chapter 15. Eigendecomposition and SVD Applications
Eigendecomposition and SVD are gems that linear algebra has bestowed upon modern human civilization. Their importance in modern applied mathematics cannot be overstated, and their applications are countless and spread across myriad disciplines.
In this chapter, I will highlight three applications that you are likely to come across in data science and related fields. My main goal is to show you that seemingly complicated data science and machine learning techniques are actually quite sensible and easily understood, once you’ve learned the linear algebra topics in this book.
PCA Using Eigendecomposition and SVD
The purpose of PCA is to find a set of basis vectors for a dataset such that each successive basis vector points in the direction of maximal variance in the data, subject to being orthogonal to the previous basis vectors.
Imagine that a dataset exists in an M-dimensional space, with each data point being a coordinate in that space. This is sensible when you think about storing the data in a matrix with N observations (each row is an observation) of M features (each column is a feature, also called a variable or measurement); the data live in M-dimensional space and comprise N vectors, or coordinates.
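The two routes to PCA named in this section's title can be sketched in a few lines of NumPy. The snippet below is a minimal illustration on a hypothetical synthetic dataset (two correlated features): it computes the principal axes once via eigendecomposition of the covariance matrix and once via SVD of the mean-centered data matrix, and confirms that the two routes agree.

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical dataset: N = 1000 observations of M = 2 correlated features
X = rng.multivariate_normal([0, 0], [[2, 1.5], [1.5, 2]], size=1000)

# PCA requires mean-centered features
Xc = X - X.mean(axis=0)
N = Xc.shape[0]

# Route 1: eigendecomposition of the covariance matrix
C = (Xc.T @ Xc) / (N - 1)
evals, evecs = np.linalg.eigh(C)        # eigh because C is symmetric
order = np.argsort(evals)[::-1]          # sort by descending variance
evals, evecs = evals[order], evecs[:, order]

# Route 2: SVD of the centered data matrix
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
# singular values relate to covariance eigenvalues: lambda = s**2 / (N-1)
evals_svd = s**2 / (N - 1)

# both routes yield the same variances and the same axes (up to sign)
print(np.allclose(evals, evals_svd))
print(np.allclose(np.abs(evecs), np.abs(Vt.T)))
```

The right singular vectors of the centered data matrix are the eigenvectors of its covariance matrix, which is why both `print` statements report agreement; the sign of each axis is arbitrary in both decompositions, hence the comparison of absolute values.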
An example in 2D is shown in Figure 15-1. The left-side panel shows the data in its original data space, in which each variable provides a basis vector for the data. Clearly the two variables (the x- and y-axes) are related to each other, and clearly there is a direction in the data that captures that relation better ...