9 Generative adversarial networks
This chapter covers
- Working with generative models for fully connected and convolutional networks
- Encoding concepts using latent vectors
- Training two networks that compete with each other
- Manipulating generation using a conditional model
- Manipulating generation with vector arithmetic
Most of what we have learned thus far has involved one-to-one mappings: every input has exactly one correct class or output. The dog can only be a “dog”; the sentence is only “positive” or “negative.” But we can also encounter one-to-many problems, where there is more than one acceptable answer. For example, we may have the concept of “seven” as input and need to create several different kinds of pictures of the digit 7. Or, to colorize an old black-and-white ...
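The one-to-many idea above can be made concrete with a tiny sketch of a conditional generator: a label (the concept “seven”) is combined with a random latent vector, and different latent vectors produce different outputs for the same label. The network, layer sizes, and weights below are hypothetical placeholders chosen just to illustrate the shapes involved; a real GAN would learn the weights through adversarial training, as this chapter develops.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 16-dim latent noise, 10 digit classes, 28x28 images.
LATENT, CLASSES, HIDDEN, PIXELS = 16, 10, 64, 28 * 28

# Random (untrained) weights, only to demonstrate the data flow.
W1 = rng.normal(0.0, 0.1, (LATENT + CLASSES, HIDDEN))
W2 = rng.normal(0.0, 0.1, (HIDDEN, PIXELS))

def generate(label, z):
    """Map (latent vector, class label) to a fake 28x28 image in [-1, 1]."""
    one_hot = np.zeros(CLASSES)
    one_hot[label] = 1.0
    # Concatenating the label with the noise is what makes this "conditional".
    h = np.tanh(np.concatenate([z, one_hot]) @ W1)
    return np.tanh(h @ W2).reshape(28, 28)

# Same concept ("seven"), two different latent vectors -> two different images.
img_a = generate(7, rng.normal(size=LATENT))
img_b = generate(7, rng.normal(size=LATENT))
print(img_a.shape)               # (28, 28)
print(np.allclose(img_a, img_b)) # False: one input concept, many outputs
```

The latent vector is what turns a one-to-one network into a one-to-many generator: holding the label fixed and resampling the noise yields a family of distinct outputs for the same concept.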