Next, we'll look at the alternative computation using Singular Value Decomposition (SVD). This algorithm is slower when the number of observations is greater than the number of features (the typical case), but yields better numerical stability, especially when some of the features are strongly correlated (often the reason to use PCA in the first place).
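To make the comparison concrete, here is a minimal NumPy sketch (not the book's listing; the array names and the synthetic data are purely illustrative) that computes the explained variances both ways, via the eigendecomposition of the covariance matrix and via the SVD of the centered data, and checks that the two routes agree:

```python
# Minimal sketch: PCA via covariance eigendecomposition vs. PCA via SVD
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 4))             # hypothetical data: 1,000 observations, 4 features
X -= X.mean(axis=0)                        # center each feature

# 1) Eigendecomposition of the (square, symmetric) covariance matrix
cov = np.cov(X, rowvar=False)              # shape (4, 4), divides by m - 1
eig_vals, eig_vecs = np.linalg.eigh(cov)   # eigenvalues returned in ascending order
eig_vals = eig_vals[::-1]                  # sort descending to match the SVD output

# 2) Thin SVD of the centered data matrix: X = U @ diag(s) @ Vt
U, s, Vt = np.linalg.svd(X, full_matrices=False)

# The squared singular values, scaled by (m - 1), equal the eigenvalues
# of the covariance matrix; the rows of Vt are the principal components.
print(np.allclose(s**2 / (X.shape[0] - 1), eig_vals))    # True

# Projecting the data onto the principal components (equivalently U * s)
projected = X @ Vt.T
print(np.allclose(projected, U * s))                     # True
```

The stability advantage comes from the fact that the SVD route never forms the covariance matrix explicitly; squaring the data (as X.T @ X does) also squares its condition number, which is what hurts the eigendecomposition when features are strongly correlated.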
SVD generalizes the eigendecomposition that we just applied to the square, symmetric covariance matrix to the more general case of m x n rectangular matrices. It takes the form X = UΣV*, shown at the center of the following diagram: the diagonal values of Σ are the singular values, and the columns of V (that is, the rows of V*) are the principal components: