Generative Gaussian mixture

A Gaussian mixture model is primarily a generative one. This means that the goal of the training process is to optimize the parameters so as to maximize the likelihood that the model has generated the dataset. If the assumptions are correct and X has been sampled from a specific data-generating process, the final approximation must also be able to generate all of the other potential samples. In other words, we are assuming that each xi ∈ X is i.i.d. and xi ∼ pdata; hence, once the optimal approximation p ≈ pdata has been found, every sample xj that has a high probability under p is also very likely to have been generated by pdata.
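The generative nature of the model can be verified directly: after fitting, we can draw brand-new samples from p ≈ pdata. A minimal sketch using scikit-learn's GaussianMixture on a synthetic two-blob dataset (the dataset and all parameter values here are illustrative assumptions, not from the original example):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.RandomState(0)

# Synthetic stand-in for the data-generating process pdata:
# two well-separated Gaussian blobs in 2D
X = np.vstack([
    rng.normal(loc=-3.0, scale=1.0, size=(200, 2)),
    rng.normal(loc=3.0, scale=1.0, size=(200, 2)),
])

# EM training maximizes the likelihood of X under the mixture
gm = GaussianMixture(n_components=2, random_state=0).fit(X)

# Because the model is generative, we can sample new points
# from the learned approximation p ≈ pdata
X_new, components = gm.sample(100)
print(X_new.shape)
```

The `sample` method returns both the generated points and the index of the mixture component that produced each one, which is convenient when inspecting which Gaussian is responsible for which region of the space.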

In this example, we want to employ a Gaussian mixture model in a semi-supervised scenario. ...
