4 Manifold Learning, Near-Isometric Embeddings, Compressed Sensing, Johnson–Lindenstrauss and Some Applications Related to the Near Whitney Extension Problem

Our work related to the topics in this chapter appears in the following references: [3, 25, 32, 33, 44, 47, 116, 117] (diffusion, hyperspectral imaging, gene clustering, partial differential equations and random matrices); [27, 39–42] (extensions); [53] (signal processing and compressed sensing); [95, 70] (shortest paths and power-weighted clustering).

4.1 Manifold and Deep Learning Via c-distorted Diffeomorphisms

One of the main challenges in high-dimensional data analysis, networks, artificial intelligence, neuroscience, optimal transport and many other related areas of research is the exponential growth of the computational and sample complexity of many generic inference tasks as a function of dimension, a phenomenon termed “the curse of dimensionality”.
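This phenomenon is easy to witness numerically. The following is a minimal sketch (our illustration, not taken from this book; it assumes NumPy and SciPy are available) showing one face of the curse: as the ambient dimension d grows, pairwise distances between uniformly random points concentrate, so the relative gap between the nearest and farthest pair collapses.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

# Distance concentration: as d grows, the spread of pairwise distances
# shrinks relative to the distances themselves.
n = 200
for d in (2, 10, 100, 1000):
    X = rng.random((n, d))   # n points drawn uniformly from the cube [0, 1]^d
    dists = pdist(X)         # all n*(n-1)/2 pairwise Euclidean distances
    rel_spread = (dists.max() - dists.min()) / dists.min()
    print(f"d = {d:4d}:  (max - min) / min = {rel_spread:.3f}")
```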

One intuition that has been put forward to lessen, or even obviate, the impact of this curse is the manifold hypothesis: that data tends to lie on or near a low-dimensional submanifold of the ambient space. See for example Figure 4.1. Algorithms and analyses based on this hypothesis constitute an enormous area of research within learning and deep learning theory known as manifold learning. Deep relationships between the manifold hypothesis and extension problems are now known; see also Chapter 22. It is an interesting problem to study the connections between the ...
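As an illustrative aside (a sketch of ours under stated assumptions, not the book's machinery of c-distorted diffeomorphisms; it assumes NumPy and SciPy), the snippet below samples a 2-dimensional “Swiss roll” submanifold embedded in R^500 and checks how well a random Gaussian projection into R^k, in the spirit of the Johnson–Lindenstrauss lemma, preserves pairwise distances on the sample.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)

# Manifold hypothesis in miniature: a 2-dimensional "Swiss roll" sample,
# living inside a D = 500 dimensional ambient space.
n, D = 300, 500
t = 3 * np.pi * (1 + 2 * rng.random(n))    # intrinsic coordinate 1
h = 20 * rng.random(n)                     # intrinsic coordinate 2
X = np.zeros((n, D))
X[:, :3] = np.column_stack([t * np.cos(t), h, t * np.sin(t)])

# A random Gaussian map into R^k is, with high probability, a near-isometry
# on the sample (Johnson-Lindenstrauss): pairwise distances are preserved up
# to a multiplicative distortion that shrinks as k grows.
for k in (10, 50, 200):
    P = rng.normal(size=(D, k)) / np.sqrt(k)
    ratios = pdist(X @ P) / pdist(X)
    print(f"k = {k:3d}: distance ratios in [{ratios.min():.3f}, {ratios.max():.3f}]")
```

Note that the target dimension k needed for a given distortion depends on the number of sample points (and, for manifold data, on its intrinsic structure) rather than on the ambient dimension D.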
