Contrastive pessimistic likelihood estimation

As explained at the beginning of this chapter, in many real-life problems it's cheaper to retrieve unlabeled samples than correctly labeled ones. For this reason, many researchers have worked on finding the best strategies for semi-supervised classification that can outperform its purely supervised counterpart. The idea is to train a classifier with a few labeled samples and then improve its accuracy by adding weighted unlabeled samples. One of the best results is the Contrastive Pessimistic Likelihood Estimation (CPLE) algorithm, proposed by M. Loog (in Loog M., Contrastive Pessimistic Likelihood Estimation for Semi-Supervised Classification, arXiv:1503.00269).
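
To make the idea more concrete, the following is a minimal, simplified sketch of a CPLE-style procedure, not the exact optimization scheme from Loog's paper: a logistic regression is first fit on the labeled samples alone, and the soft labels q assigned to the unlabeled samples are then pushed in the pessimistic direction, that is, toward the values that minimize the log-likelihood gain of the semi-supervised refit over the supervised baseline. The dataset sizes, step size, and number of iterations below are illustrative assumptions.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Small labeled set, larger unlabeled set (binary problem)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_l, y_l = X[:20], y[:20]
X_u = X[20:]


def soft_log_likelihood(clf, X_l, y_l, X_u, q):
    # Log-likelihood of the labeled data plus the q-weighted unlabeled data
    p_l = clf.predict_proba(X_l)
    p_u = clf.predict_proba(X_u)
    ll_l = np.sum(np.log(p_l[np.arange(len(y_l)), y_l] + 1e-12))
    # q[i] is the soft responsibility of class 1 for the i-th unlabeled sample
    ll_u = np.sum(q * np.log(p_u[:, 1] + 1e-12) +
                  (1.0 - q) * np.log(p_u[:, 0] + 1e-12))
    return ll_l + ll_u


def fit_semi(X_l, y_l, X_u, q):
    # Refit on the labeled data plus each unlabeled sample duplicated
    # once per class, weighted by its soft label
    X_aug = np.vstack([X_l, X_u, X_u])
    y_aug = np.concatenate([y_l, np.ones(len(X_u)), np.zeros(len(X_u))])
    w_aug = np.concatenate([np.ones(len(y_l)), q, 1.0 - q])
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X_aug, y_aug.astype(int), sample_weight=w_aug)
    return clf


# Supervised baseline (the "contrast" term)
clf_sup = LogisticRegression(max_iter=1000).fit(X_l, y_l)

# Pessimistic search: nudge q toward the soft labels that minimize the gain
# of the semi-supervised model over the supervised one (illustrative
# alternating loop, not the optimizer used in the paper)
q = np.full(len(X_u), 0.5)
for _ in range(10):
    clf_semi = fit_semi(X_l, y_l, X_u, q)
    gain = (soft_log_likelihood(clf_semi, X_l, y_l, X_u, q) -
            soft_log_likelihood(clf_sup, X_l, y_l, X_u, q))
    # Gradient of the gain with respect to q, followed by a descent step
    grad = (np.log(clf_semi.predict_proba(X_u)[:, 1] + 1e-12) -
            np.log(clf_semi.predict_proba(X_u)[:, 0] + 1e-12) -
            np.log(clf_sup.predict_proba(X_u)[:, 1] + 1e-12) +
            np.log(clf_sup.predict_proba(X_u)[:, 0] + 1e-12))
    q = np.clip(q - 0.1 * grad, 0.0, 1.0)

print('Pessimistic gain after refit:', gain)

The key design choice shown here is the contrast: the soft labels are not chosen to help the semi-supervised model, but to make its advantage over the supervised solution as small as possible, which is what protects the final classifier against a degradation in performance.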

Before explaining ...
