Implementing WGAN
Now that we have a basic idea of how the Wasserstein GAN works, let's implement it. Once again, we'll follow the DCGAN blueprint and omit the repetitive code snippets so that we can focus on the differences.

The build_generator and build_critic functions instantiate the generator and the critic, respectively. For the sake of simplicity, both networks contain only fully connected layers, and all hidden layers use LeakyReLU activations. Following the paper's guidelines, the generator has a Tanh output activation, while the critic produces a single scalar output (with no sigmoid activation, though).

Next, let's implement the train method, since it contains some WGAN specifics. We'll start with the method's declaration and the initialization ...
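Before looking at the method itself, it may help to see the WGAN-specific training mechanics in isolation. The following is a minimal NumPy sketch (not the book's Keras code): toy linear "networks" stand in for the real models so that the three distinctive steps are visible: the critic is updated n_critic times per generator update, the critic's weights are clipped to a small range to approximate the Lipschitz constraint, and the losses are plain means of the critic's scalar output rather than log/sigmoid losses. The names `n_critic`, `clip_value`, and the toy data are illustrative assumptions; the hyperparameter values follow the WGAN paper's defaults.

```python
import numpy as np

rng = np.random.default_rng(0)

# WGAN paper defaults (assumed here for illustration)
clip_value = 0.01   # weight-clipping bound for the critic
n_critic = 5        # critic updates per generator update
lr = 5e-5

# Toy linear models so the training loop stays self-contained:
# critic score = x @ w_c (single scalar, no sigmoid),
# generator sample = z @ w_g.
w_c = rng.normal(size=2)
w_g = rng.normal(size=(1, 2))

real = rng.normal(loc=3.0, size=(64, 2))  # toy "real" data

for step in range(200):
    # --- 1. train the critic n_critic times ---
    for _ in range(n_critic):
        z = rng.normal(size=(64, 1))
        fake = z @ w_g
        # Critic loss: -(E[critic(real)] - E[critic(fake)]).
        # For the linear critic its gradient w.r.t. w_c is:
        grad_c = -(real.mean(axis=0) - fake.mean(axis=0))
        w_c -= lr * grad_c
        # --- 2. clip the critic's weights (Lipschitz constraint) ---
        w_c = np.clip(w_c, -clip_value, clip_value)

    # --- 3. train the generator once ---
    z = rng.normal(size=(64, 1))
    # Generator loss: -E[critic(G(z))]; gradient w.r.t. w_g
    # for the linear models above:
    grad_g = -np.outer(z.mean(axis=0), w_c)
    w_g -= lr * grad_g

# After training, every critic weight lies inside the clipping range
assert np.all(np.abs(w_c) <= clip_value)
```

In the real implementation the two gradient computations are of course handled by the framework's optimizer (the paper recommends RMSprop); only the n_critic inner loop and the per-batch weight clipping need to be written by hand.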