So now that we have a generator, a discriminator, and our loss function, all that is left is to train! We will sketch how to do this in TensorFlow; there is nothing fancy in this part, as it is just a matter of piecing together the pieces from the previous section, along with loading and feeding MNIST images as we did earlier.
First, set up two solvers: one for the discriminator and one for the generator. A smaller value of beta1 is used for the AdamOptimizer, as it has been shown to help GAN training converge:
discriminator_solver = tf.train.AdamOptimizer(learning_rate=0.001, beta1=0.5)
generator_solver = tf.train.AdamOptimizer(learning_rate=0.001, beta1=0.5)
Next, create a random noise vector; this can be done with ...
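The sentence above is cut off in the source, so as a sketch only (not necessarily the book's exact code): a common choice is to sample the noise vector uniformly from [-1, 1] with NumPy and feed it to the generator each step. The function name `sample_noise` and the batch/dimension values below are illustrative assumptions:

```python
import numpy as np

def sample_noise(batch_size, noise_dim):
    # Illustrative helper (not from the book): draw uniform noise in [-1, 1].
    # Gaussian noise, np.random.normal(0, 1, ...), is another common choice.
    return np.random.uniform(-1.0, 1.0,
                             size=(batch_size, noise_dim)).astype(np.float32)

# Example: one minibatch of 16 noise vectors of dimension 100,
# ready to feed to the generator via a feed_dict.
z = sample_noise(16, 100)
```

Each training iteration would typically draw a fresh batch of noise like this, rather than reusing a fixed one, so the generator sees many different input points.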