Random projection
PCA tries to find structure in the data and uses it to reduce dimensionality: it finds a basis in which most of the original variance is preserved. There is, however, an alternative approach: instead of trying to learn the basis, generate it randomly and then project the original data onto it.
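A minimal sketch of this idea, using NumPy and made-up sizes (1000 points in 10,000 dimensions, reduced to 100): draw a random Gaussian matrix and use it as the projection basis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 1000 points in 10,000 dimensions, reduced to 100.
n, d, k = 1000, 10_000, 100
X = rng.normal(size=(n, d))

# Instead of learning a basis (as PCA does), draw it at random:
# a k x d matrix with i.i.d. Gaussian entries, scaled by 1/sqrt(k).
R = rng.normal(size=(k, d)) / np.sqrt(k)

# Project the data onto the random basis.
X_reduced = X @ R.T
print(X_reduced.shape)  # (1000, 100)
```

Unlike PCA, no fitting step is required: the matrix `R` is generated without ever looking at the data.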
Surprisingly, this simple idea works quite well in practice. The reason is that such a transformation approximately preserves distances: if two objects are close to each other in the original space, they remain close after the projection; likewise, if two objects are far apart, they remain far apart in the reduced space.
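The distance-preservation claim can be checked empirically. The sketch below (a NumPy illustration with made-up sizes, not taken from the text) projects two high-dimensional points and compares their distance before and after; with the 1/sqrt(k) scaling, the ratio is close to 1.

```python
import numpy as np

rng = np.random.default_rng(42)
d, k = 10_000, 500  # hypothetical original and reduced dimensions

# Two random points in the original high-dimensional space.
x = rng.normal(size=d)
y = rng.normal(size=d)

# Random Gaussian projection, scaled so squared distances
# are preserved in expectation.
R = rng.normal(size=(k, d)) / np.sqrt(k)

orig_dist = np.linalg.norm(x - y)
proj_dist = np.linalg.norm(R @ x - R @ y)
ratio = proj_dist / orig_dist
print(ratio)  # close to 1: the projection roughly preserves the distance
```

The larger the reduced dimension `k`, the tighter this ratio concentrates around 1.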
Smile already has an implementation ...