October 2018
Intermediate to advanced
368 pages
9h 20m
English
Since the introduction of Generative Adversarial Networks (GANs) in 2014[1], their popularity has increased rapidly. GANs have proved to be a useful generative model that can synthesize new data that look real. Many of the deep learning research papers that followed proposed measures to address the difficulties and limitations of the original GAN.
As we discussed in previous chapters, GANs can be notoriously difficult to train and are prone to mode collapse. Mode collapse is a situation in which the generator produces outputs that all look the same, even though the loss functions have already been optimized. In the context of MNIST digits, with mode collapse the generator may produce only the digits 4 and 9, because they look similar. ...
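To make the symptom concrete, here is a minimal sketch of how one might quantify mode collapse: count how many of the ten digit classes actually appear among a generator's (labeled) samples. The two generator functions below are hypothetical stand-ins, not part of any GAN library; in practice the labels would come from running a pretrained MNIST classifier on the generated images.

```python
import numpy as np

rng = np.random.default_rng(0)

def collapsed_generator(n):
    # Hypothetical stand-in for a collapsed generator: it only ever
    # produces samples from two modes (the digits 4 and 9).
    return rng.choice([4, 9], size=n)

def healthy_generator(n):
    # Hypothetical stand-in for a healthy generator that covers
    # all ten digit classes.
    return rng.integers(0, 10, size=n)

def mode_coverage(labels, n_classes=10):
    # Fraction of classes that appear at least once in the samples.
    return len(np.unique(labels)) / n_classes

print(mode_coverage(collapsed_generator(1000)))  # 0.2: only 2 of 10 digits
print(mode_coverage(healthy_generator(1000)))    # 1.0: all digits appear
```

A coverage far below 1.0 on a large sample is a simple red flag for mode collapse, even when the generator and discriminator losses look healthy.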