May 2018
The activation function of this model is the ReLU function (see Chapter 9, Getting Your Neurons to Work).
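As a quick refresher (a minimal sketch, not the model's code), ReLU passes positive values through unchanged and clips negative values to zero:

```python
# Minimal sketch of the ReLU activation: ReLU(x) = max(0, x).
# Positive inputs pass through unchanged; negatives become zero.
def relu(x):
    return max(0.0, x)

outputs = [relu(v) for v in [-2.0, -0.5, 0.0, 1.5, 3.0]]
print(outputs)
```

This elementwise clipping is why ReLU is cheap to compute and why its output histograms, which we examine below, pile up at zero.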
When we peek into its structure, we once again see that it is not just an activation function in a sequential process. Its output feeds the dropout and training nodes through complex intermediate calculations, as shown in the following description of the activation function:

TensorBoard provides us with yet another visual representation with which to monitor the distributions of preactivations and, in this case, activations. The DISTRIBUTIONS section enables us to detect anomalies in ...
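As an illustrative sketch (not the book's graph code), the dropout stage that the ReLU output feeds can be simulated in plain Python. The tensors it produces are exactly the kind of values whose histograms TensorBoard plots across training steps in the DISTRIBUTIONS section:

```python
import random

# Illustrative sketch: inverted dropout applied to a batch of ReLU
# activations. During training, each unit is zeroed with probability p
# and the survivors are scaled by 1/(1 - p), so the expected activation
# is unchanged. TensorBoard's DISTRIBUTIONS section summarizes tensors
# like `dropped` below as histograms over training steps.
def dropout(activations, p, rng):
    keep = 1.0 - p
    return [a / keep if rng.random() < keep else 0.0 for a in activations]

rng = random.Random(0)            # fixed seed for a reproducible sketch
acts = [0.0, 1.5, 3.0, 0.7, 2.2]  # ReLU outputs are already >= 0
dropped = dropout(acts, p=0.5, rng=rng)
print(dropped)
```

A healthy distribution keeps its spread over training; if the histogram collapses toward zero, too many units are dying or being dropped, which is the kind of anomaly this view helps detect.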