Let's look at the code for it and then explore the important features:
netG.zero_grad()
labelv = Variable(label.fill_(real_label))  # fake labels are real for generator cost
output = netD(fake)
errG = criterion(output, labelv)
errG.backward()
optimizerG.step()
It looks similar to what we did while training the discriminator on fake images, except for a few key differences. We pass in the same fake images created by the generator, but this time we do not detach them from the graph that produced them, because we want the generator to be trained. We calculate the loss (errG) and compute the gradients with errG.backward(). Then we step only the generator optimizer, since we want only the generator to be trained, and we repeat this entire process until the generator starts producing realistic images.
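The effect of detaching (or not) can be seen in a minimal sketch. The modules and shapes below are toy stand-ins, not the chapter's actual networks: a plain Linear layer plays the generator and another plays the discriminator, just to show that detach() blocks gradients from reaching the generator while the undetached path lets them flow back.

```python
import torch

torch.manual_seed(0)
netG = torch.nn.Linear(4, 4)   # toy stand-in for the generator
netD = torch.nn.Linear(4, 1)   # toy stand-in for the discriminator
noise = torch.randn(2, 4)
fake = netG(noise)

# Discriminator-style step: detach the fake images, so no
# gradients are propagated back into the generator
netD(fake.detach()).sum().backward()
grad_after_detached = netG.weight.grad   # still None: detach cut the graph

# Generator-style step: no detach, so gradients flow through
# the discriminator back into the generator's parameters
netG.zero_grad()
netD(fake).sum().backward()
grad_after_attached = netG.weight.grad   # now populated
```

This is why the real training loop detaches fake when updating the discriminator but keeps the graph intact here: the generator can only learn if the discriminator's loss backpropagates through the images it produced.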