This chapter follows up the previous one on sparsity-aware learning, with the emphasis now on the algorithmic front. Greedy, iterative thresholding, and convex optimization algorithms are presented and discussed for both batch and online learning. Extensions of ℓ1-norm regularization are introduced, such as group-sparse modeling, structured sparsity, and total variation. The issue of analysis versus synthesis sparse modeling is presented, together with the notion of cosparsity. Finally, a case study of sparse modeling for time-frequency analysis in the context of Gabor frames is demonstrated.
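As a taste of the iterative thresholding family mentioned above, the following is a minimal sketch of the iterative soft-thresholding algorithm (ISTA) for the ℓ1-regularized least-squares problem min_x ½‖Ax − b‖² + λ‖x‖₁. The function names, step-size choice, and demo dimensions are illustrative assumptions, not the chapter's own notation.

```python
import numpy as np

def soft_threshold(x, tau):
    # Elementwise soft-thresholding: the proximal operator of tau * ||.||_1
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, b, lam, n_iter=500):
    # Iterative soft-thresholding for min_x 0.5*||Ax - b||^2 + lam*||x||_1
    x = np.zeros(A.shape[1])
    # Constant step size 1/L, with L = ||A||_2^2 the Lipschitz constant
    # of the gradient of the smooth term
    t = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(n_iter):
        # Gradient step on the least-squares term, then prox of the l1 term
        x = soft_threshold(x - t * A.T @ (A @ x - b), t * lam)
    return x

# Tiny demo (hypothetical sizes): recover a 3-sparse vector
# from 30 noisy random measurements of a 50-dimensional signal
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 50))
x_true = np.zeros(50)
x_true[[5, 17, 40]] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(30)
x_hat = ista(A, b, lam=0.1)
```

Each iteration costs only two matrix-vector products, which is what makes this family attractive for the large-scale problems the chapter discusses.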