11.2 Orthogonal Projection-Based EEAs

In this section, we further show that PPI, VCA, and ATGP-EEA are essentially the same type of algorithm appearing in different forms. All three EEAs use the OP derived from the principle of orthogonality to find endmembers. OP is one of the most widely used concepts in statistical signal processing and plays a key role in mean squared error- or least squares error-based approaches (Poor, 1994). It basically states that any new or innovation information must be orthogonal, in the sense of mean squared error or least squares error, to the information that is already known or that can be inferred from the data samples already processed. A prime example of using OP is the orthogonal subspace projection (OSP) approach developed for linear spectral mixture analysis (Harsanyi and Chang, 1994; Chang, 2003a; Chang, 2005), as well as ATGP used for target detection and classification (Ren and Chang, 2003; Chang, 2003a). Interestingly, the first use of OP in endmember extraction is PPI, which assumes that the more likely a data sample is to be an endmember, the better its chance of being orthogonally projected onto the end points of a skewer, where a skewer is a randomly generated unit vector. A further use of OP is exploited by VCA, which selects the data sample vector with the maximal OP in the OSP-projected space as a potential endmember. Interestingly, PPI, ATGP, and VCA share the same idea of using OP to find endmembers, ...
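The skewer-based use of OP described above for PPI can be sketched as follows. This is a minimal illustration, not the book's implementation: data samples are projected onto randomly generated unit vectors (skewers), and each sample that falls at an end point of a skewer has its purity count incremented; the function name `ppi_scores` and the parameter choices are ours for illustration.

```python
import numpy as np

def ppi_scores(data, n_skewers=500, seed=0):
    """Count, for each pixel, how often it lands at an extreme
    (minimum or maximum) of the orthogonal projection onto a
    randomly generated unit vector (skewer)."""
    rng = np.random.default_rng(seed)
    n_pixels, n_bands = data.shape
    scores = np.zeros(n_pixels, dtype=int)
    for _ in range(n_skewers):
        skewer = rng.standard_normal(n_bands)
        skewer /= np.linalg.norm(skewer)   # random unit vector
        proj = data @ skewer               # OP of every pixel onto the skewer
        scores[np.argmin(proj)] += 1       # end points of the skewer
        scores[np.argmax(proj)] += 1
    return scores
```

Under the linear mixing model, mixed pixels are convex combinations of endmembers and therefore cannot be strictly extreme in any projection, so only the pure pixels (simplex vertices) accumulate counts; pixels whose counts exceed a threshold are retained as endmember candidates.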
