Batch normalization (for more information, refer to Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift, by S. Ioffe and C. Szegedy, arXiv:1502.03167, 2015) is a technique for accelerating learning that generally also achieves better accuracy. We will look at examples of its usage in Chapter 4, Generative Adversarial Networks and WaveNet, when we discuss GANs. Here is the prototype, showing the parameters and their default values:
keras.layers.normalization.BatchNormalization(axis=-1, momentum=0.99, epsilon=0.001, center=True, scale=True, beta_initializer='zeros', gamma_initializer='ones', moving_mean_initializer='zeros', moving_variance_initializer='ones', beta_regularizer=None, gamma_regularizer=None, beta_constraint=None, gamma_constraint=None)
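As a minimal sketch of how the layer is typically used (the model architecture and layer sizes here are illustrative, not taken from the text), batch normalization is commonly inserted between a layer's affine transformation and its activation:

from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.layers.normalization import BatchNormalization

# Hypothetical model: BatchNormalization placed between the Dense
# layer and its activation, a common placement choice.
model = Sequential()
model.add(Dense(64, input_shape=(784,)))
model.add(BatchNormalization())   # default axis=-1 normalizes each feature
model.add(Activation('relu'))
model.add(Dense(10, activation='softmax'))
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

During training the layer normalizes each batch using the batch mean and variance, while the moving_mean_initializer and moving_variance_initializer parameters seed the running statistics used at inference time.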