May 2018
To make sense of the covariance matrix, look at its eigenvectors and eigenvalues: each eigenvector points in a direction along which the data varies, and the corresponding eigenvalue expresses the magnitude, or importance, of the variance in that direction.

To sum it up, an eigenvector provides a direction, and its eigenvalue the importance of that direction, for the covariance matrix a. With those results, we will be able to represent the PCA with the TensorBoard projector in a multi-dimensional space.
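The idea above can be sketched with NumPy: build the covariance matrix of a dataset, take its eigendecomposition, and project the data onto the most important directions. The random data and variable names here are illustrative, not from the book.

```python
import numpy as np

# Hypothetical data: 100 samples, 3 features (illustrative only)
rng = np.random.default_rng(0)
x = rng.normal(size=(100, 3))

# Covariance matrix a of the features
a = np.cov(x, rowvar=False)

# Eigenvectors give the principal directions,
# eigenvalues their importance (variance along each direction)
w, v = np.linalg.eigh(a)        # eigh: a is symmetric
order = np.argsort(w)[::-1]     # sort by decreasing eigenvalue
w, v = w[order], v[:, order]

# Project the centered data onto the two most important directions
projected = (x - x.mean(axis=0)) @ v[:, :2]
print(projected.shape)
```

The projected coordinates are exactly the kind of low-dimensional representation that can then be visualized with the TensorBoard projector.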
Let w be an eigenvalue of a. An eigenvalue must satisfy the following equation:

dot(a, v) = w * v

That is, there must exist a vector v for which dot(a, v) is the same as w*v:
NumPy will do ...
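A minimal check of this equation with NumPy's `np.linalg.eig`, using a small symmetric matrix chosen for illustration:

```python
import numpy as np

# A small example matrix (illustrative, not from the book)
a = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns the eigenvalues w and the eigenvectors
# as the columns of v
w, v = np.linalg.eig(a)

# Verify dot(a, v) == w * v for each eigenpair
for i in range(len(w)):
    assert np.allclose(np.dot(a, v[:, i]), w[i] * v[:, i])
```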