Implementing WGAN

Now that we have a basic idea of how the Wasserstein GAN works, let's implement it. Once again, we'll follow the DCGAN blueprint and omit the repetitive code snippets so that we can focus on the differences. The build_generator and build_critic functions instantiate the generator and the critic, respectively. For simplicity, the two networks contain only fully connected layers. All hidden layers use LeakyReLU activations. Following the paper's guidelines, the generator has a tanh output activation, while the critic produces a single scalar output with no sigmoid activation. Next, let's implement the train method, since it contains some WGAN specifics. We'll start with the method's declaration and the initialization ...
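The book's own listings are omitted here, but as an illustration, a minimal tf.keras sketch of what build_generator and build_critic might look like follows. The layer sizes, the latent_dim parameter, and the 28×28 image shape are assumptions for the example, not the book's exact values; the essential points from the text are the fully connected hidden layers with LeakyReLU, the tanh generator output, and the critic's single raw scalar output.

```python
import numpy as np
from tensorflow.keras import layers, models


def build_generator(latent_dim=64, img_shape=(28, 28)):
    """Fully connected generator: LeakyReLU hidden layers, tanh output."""
    z = layers.Input(shape=(latent_dim,))
    x = layers.Dense(128)(z)
    x = layers.LeakyReLU(0.2)(x)
    x = layers.Dense(256)(x)
    x = layers.LeakyReLU(0.2)(x)
    # tanh keeps the generated pixels in [-1, 1]
    x = layers.Dense(int(np.prod(img_shape)), activation='tanh')(x)
    x = layers.Reshape(img_shape)(x)
    return models.Model(z, x, name='generator')


def build_critic(img_shape=(28, 28)):
    """Fully connected critic: single scalar output, no sigmoid."""
    img = layers.Input(shape=img_shape)
    x = layers.Flatten()(img)
    x = layers.Dense(256)(x)
    x = layers.LeakyReLU(0.2)(x)
    x = layers.Dense(128)(x)
    x = layers.LeakyReLU(0.2)(x)
    score = layers.Dense(1)(x)  # raw Wasserstein score, not a probability
    return models.Model(img, score, name='critic')
```

Because the critic's output is an unbounded score rather than a probability, there is no sigmoid on its final layer; that is the key structural difference from the DCGAN discriminator.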
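Before looking at the method itself, it may help to see the WGAN-specific ingredients in isolation. The following framework-agnostic NumPy sketch shows the two losses and the weight clipping from the original WGAN paper; the helper names (clip_weights, critic_loss, generator_loss) and the default clip constant c = 0.01 are illustrative, not the book's exact code.

```python
import numpy as np


def clip_weights(weights, c=0.01):
    """WGAN weight clipping: constrain every parameter to [-c, c].

    This is the paper's crude way of enforcing the Lipschitz
    constraint on the critic after each training step.
    """
    return [np.clip(w, -c, c) for w in weights]


def critic_loss(real_scores, fake_scores):
    """Critic maximizes E[f(x)] - E[f(g(z))], so we minimize the negative."""
    return np.mean(fake_scores) - np.mean(real_scores)


def generator_loss(fake_scores):
    """Generator maximizes E[f(g(z))], i.e. minimizes -E[f(g(z))]."""
    return -np.mean(fake_scores)
```

In the full training loop, the critic is updated several times per generator update (n_critic steps, with weight clipping after each), which is one of the WGAN specifics the train method has to handle.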