In this section, we will discuss in detail the issues faced by neural networks, which will serve as a stepping stone toward building deep learning networks.
One of the major issues with neural networks is the problem of the "vanishing gradient" (References [8]). We will give a simple explanation of the issue rather than exploring the mathematical derivations in depth. To demonstrate the issue, we will use the sigmoid activation function and a two-layer neural network, as shown in the following figure (a short numeric illustration follows the figure):

Figure 5: Vanishing Gradient issue.
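As a minimal numeric sketch (this code is not from the book; it assumes only NumPy and the standard logistic sigmoid), the following Python snippet shows why the gradient shrinks: the sigmoid's derivative sigmoid(x) * (1 - sigmoid(x)) never exceeds 0.25, and backpropagation multiplies one such factor per layer, so the signal reaching the early layers decays geometrically with depth:

    import numpy as np

    def sigmoid(x):
        # Standard logistic sigmoid
        return 1.0 / (1.0 + np.exp(-x))

    def sigmoid_derivative(x):
        # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x)), at most 0.25
        s = sigmoid(x)
        return s * (1.0 - s)

    print(sigmoid_derivative(0.0))   # 0.25, the maximum possible value
    print(sigmoid_derivative(4.0))   # ~0.018, near zero in the saturated region

    # Backpropagation multiplies one derivative factor per layer the error
    # passes through; even in the best case of 0.25 per layer, the product
    # vanishes quickly as the network gets deeper.
    for depth in (2, 5, 10, 20):
        print(depth, 0.25 ** depth)

Even at only 20 layers, the best-case factor falls below 10^-12, which is why the weights in the early layers of a deep sigmoid network receive almost no learning signal.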
As we saw in the activation ...