The training loop

We have previously had the luxury of calling .fit() on our model and letting Keras handle the painful process of breaking the data into minibatches and training for us.

Unfortunately, because we need to perform separate updates for the discriminator and the stacked model within a single batch, we're going to have to do things the old-fashioned way, with a few loops. This is how things used to be done all the time, so while it's perhaps a little more work, it does admittedly leave me feeling nostalgic. The following code illustrates the training technique:

num_examples = X_train.shape[0]
num_batches = int(num_examples / float(batch_size))
half_batch = int(batch_size / 2)

for epoch in range(epochs + 1):
    for batch ...
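The listing is truncated at this point, so what follows is a minimal sketch of how a loop like this is commonly completed, not the book's exact code. It assumes the generator, the discriminator, and the combined stacked model built in the earlier sections, along with a latent_dim noise dimension; all of those names are placeholders for whatever you defined previously. Each batch first updates the discriminator on a half batch of real images and a half batch of generated ones, then updates the generator through the stacked model by labeling its fakes as real:

import numpy as np

# Picks up where the excerpt cuts off. generator, discriminator, combined
# (the stacked model), X_train, batch_size, epochs, and latent_dim are all
# assumed to exist from the earlier sections; the names are placeholders.
for epoch in range(epochs + 1):
    for batch in range(num_batches):
        # Discriminator update: a half batch of real images labeled 1
        # and a half batch of generated images labeled 0.
        idx = np.random.randint(0, num_examples, half_batch)
        real_images = X_train[idx]
        noise = np.random.normal(0, 1, (half_batch, latent_dim))
        fake_images = generator.predict(noise)

        d_loss_real = discriminator.train_on_batch(real_images, np.ones((half_batch, 1)))
        d_loss_fake = discriminator.train_on_batch(fake_images, np.zeros((half_batch, 1)))
        d_loss = 0.5 * np.add(d_loss_real, d_loss_fake)

        # Generator update through the stacked model: fresh noise labeled 1,
        # so the frozen discriminator pushes the generator toward images
        # it believes are real.
        noise = np.random.normal(0, 1, (batch_size, latent_dim))
        g_loss = combined.train_on_batch(noise, np.ones((batch_size, 1)))

    print("Epoch %d  [D loss: %s]  [G loss: %s]" % (epoch, d_loss, g_loss))

Note how the half_batch bookkeeping from the setup code is used: the discriminator sees half_batch real and half_batch fake examples per step, while the generator trains on a full batch_size of noise through the stacked model.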
