Adamczak, R., Litvak, A., Pajor, A., Tomczak-Jaegermann, N., 2011. Restricted
isometry property of matrices with independent columns and neighborly polytopes
by random sampling. Constructive Approximation 34 (1), 61–88.
Aharon, M., Elad, M., Bruckstein, A., 2006a. K-SVD: An algorithm for designing
overcomplete dictionaries for sparse representation. IEEE Transactions on Signal
Processing 54 (11), 4311–4322.
Aharon, M., Elad, M., Bruckstein, A., 2006b. On the uniqueness of overcomplete
dictionaries, and a practical way to retrieve them. Linear Algebra and its
Applications 416 (1), 48–67.
Antoniadis, A., Fan, J., 2001. Regularization of wavelet approximations. Journal of
the American Statistical Association 96 (455).
Asadi, N. B., Rish, I., Scheinberg, K., Kanevsky, D., Ramabhadran, B., 2009. MAP
approach to learning sparse Gaussian Markov networks. In: Proc. of the IEEE
International Conference on Acoustics, Speech and Signal Processing (ICASSP).
pp. 1721–1724.
Asif, M., Romberg, J., 2010. On the Lasso and Dantzig selector equivalence. In:
Proc. of the 44th Annual Conference on Information Sciences and Systems (CISS).
IEEE, pp. 1–6.
Atia, G., Saligrama, V., March 2012. Boolean compressed sensing and noisy group
testing. IEEE Transactions on Information Theory 58 (3), 1880–1901.
Ausiello, G., Protasi, M., Marchetti-Spaccamela, A., Gambosi, G., Crescenzi, P.,
Kann, V., 1999. Complexity and Approximation: Combinatorial Optimization
Problems and Their Approximability Properties. Springer-Verlag New York.
Bach, F., 2008a. Bolasso: Model consistent Lasso estimation through the bootstrap.
In: Proc. of the 25th International Conference on Machine Learning (ICML).
Bach, F., 2008b. Consistency of the group Lasso and multiple kernel learning. Journal
of Machine Learning Research 9, 1179–1225.
Bach, F., June 2008c. Consistency of trace norm minimization. Journal of Machine
Learning Research 9, 1019–1048.
Bach, F., 2010. Self-concordant analysis for logistic regression. Electronic Journal of
Statistics 4, 384–414.
Bach, F., Jenatton, R., Mairal, J., Obozinski, G., 2012. Optimization with sparsity-
inducing penalties. Foundations and Trends in Machine Learning 4 (1), 1–106.
Bach, F., Lanckriet, G., Jordan, M., 2004. Multiple kernel learning, conic duality,
and the SMO algorithm. In: Proc. of the Twenty-first International Conference on
Machine Learning (ICML).
Bach, F., Mairal, J., Ponce, J., 2008. Convex sparse matrix factorizations. arXiv
preprint arXiv:0812.1869.
Bakin, S., 1999. Adaptive regression and model selection in data mining problems.
Ph.D. thesis, Australian National University, Canberra, Australia.
Balasubramanian, K., Yu, K., Lebanon, G., 2013. Smooth sparse coding via marginal
regression for learning sparse representations. In: Proc. of the International Con-
ference on Machine Learning (ICML). pp. 289–297.
Baliki, M., Geha, P., Apkarian, A., 2009. Parsing pain perception between nocicep-
tive representation and magnitude estimation. Journal of Neurophysiology 101.
Baliki, M., Geha, P., Apkarian, A., Chialvo, D., 2008. Beyond feeling: Chronic pain
hurts the brain, disrupting the default-mode network dynamics. The Journal of
Neuroscience 28 (6), 1398–1403.
Banerjee, A., Merugu, S., Dhillon, I., Ghosh, J., April 2004. Clustering with Breg-
man divergences. In: Proc. of the Fourth SIAM International Conference on Data
Mining. pp. 234–245.
Banerjee, A., Merugu, S., Dhillon, I. S., Ghosh, J., October 2005. Clustering with
Bregman divergences. Journal of Machine Learning Research 6, 1705–1749.
Banerjee, O., El Ghaoui, L., d’Aspremont, A., March 2008. Model selection through
sparse maximum likelihood estimation for multivariate Gaussian or binary data.
Journal of Machine Learning Research 9, 485–516.
Banerjee, O., El Ghaoui, L., d’Aspremont, A., Natsoulis, G., 2006. Convex optimiza-
tion techniques for fitting sparse Gaussian graphical models. In: Proc. of the 23rd
International Conference on Machine Learning (ICML). pp. 89–96.
Baraniuk, R., Davenport, M., DeVore, R., Wakin, M., 2008. A simple proof of the
restricted isometry property for random matrices. Constructive Approximation
28 (3), 253–263.
Beck, A., Teboulle, M., 2009. A fast iterative shrinkage-thresholding algorithm for
linear inverse problems. SIAM J. Imaging Sciences 2 (1), 183–202.