In Chapter 6, we examined the intricacies of artificial neurons. This chapter is the natural extension of that work: We cover how individual neural units are linked together to form artificial neural networks, including deep learning networks.
In our Shallow Net in Keras Jupyter notebook (a schematic of which is available in Figure 5.4), we crafted an artificial neural network with the following layers:
An input layer consisting of 784 neurons, one for each of the 784 pixels in an MNIST image
A hidden layer composed of 64 sigmoid neurons
An output layer consisting of 10 softmax neurons, one for each of the 10 classes of digits
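To make the flow of information through these three layers concrete, here is a minimal NumPy sketch of a single forward pass through this architecture. The weights are random stand-ins for trained parameters (the real notebook learns them with Keras), and the `sigmoid` and `softmax` helpers are hypothetical names defined here for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(z):
    # Squashes each value into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Converts raw scores into probabilities that sum to 1
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical random weights standing in for trained parameters
W_hidden = rng.normal(scale=0.1, size=(64, 784))  # input -> hidden
b_hidden = np.zeros(64)
W_output = rng.normal(scale=0.1, size=(10, 64))   # hidden -> output
b_output = np.zeros(10)

x = rng.random(784)                    # one flattened 28x28 MNIST image
h = sigmoid(W_hidden @ x + b_hidden)   # 64 sigmoid activations
y = softmax(W_output @ h + b_output)   # 10 class probabilities

print(h.shape, y.shape)  # (64,) (10,)
```

The shapes trace the network exactly: 784 pixel intensities enter, the hidden layer reduces them to 64 sigmoid activations, and the output layer yields a probability for each of the 10 digit classes.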
Of these three, the input layer is the most straightforward ...