June 2019
Intermediate to advanced
308 pages
7h 21m
English
To allow a neural network to learn complex decision boundaries, we apply a non-linear activation function to some of its layers. Commonly used functions include tanh, ReLU, softmax, and variants of these. More precisely, each neuron receives as input the weighted sum of the activation values of the neurons connected to it, scaled by the synaptic weights of those connections. One of the most widely used activation functions is the so-called sigmoid (logistic) function, which is defined as follows:

σ(x) = 1 / (1 + e^(-x))
The domain of this function is all real numbers, and its range is the open interval (0, 1). This means that any value obtained as an output ...
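As a minimal sketch of this property, the following snippet implements the logistic sigmoid directly from its definition and checks its behavior at a few inputs; the function name `sigmoid` is just an illustrative choice, not from the original text:

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: maps any real number into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Large negative inputs approach 0, zero maps to 0.5,
# and large positive inputs approach 1.
print(sigmoid(-10.0))  # close to 0
print(sigmoid(0.0))    # exactly 0.5
print(sigmoid(10.0))   # close to 1
```

Because the output is always strictly between 0 and 1, the sigmoid is often interpreted as a smooth, differentiable stand-in for a probability or a firing rate.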