Let’s see how to build a neural network, starting with the perceptron that we already have. As a reminder, here is that perceptron again—a weighted sum of the inputs, followed by a sigmoid:

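In code, that forward pass boils down to a couple of lines. Here is a minimal sketch of it, assuming NumPy; the function and variable names (`sigmoid`, `forward`, `x`, `w`) are illustrative stand-ins, not necessarily the ones used elsewhere in the book:

```python
import numpy as np

def sigmoid(z):
    # Squash the weighted sum into the (0, 1) range
    return 1 / (1 + np.exp(-z))

def forward(x, w):
    # The perceptron: a weighted sum of the inputs, followed by a sigmoid
    weighted_sum = np.matmul(x, w)
    return sigmoid(weighted_sum)
```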
In Part I of this book, we didn’t just use the perceptron as is—we also combined perceptrons in two different ways. First, we trained the perceptron with many MNIST images at once; and second, we used ten perceptrons to classify the ten possible digits. In Assembling Perceptrons, we compared those two operations to “stacking” and “parallelizing” perceptrons, respectively, as shown in the picture.
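Both combinations fall out naturally from matrix shapes. As a rough sketch (again assuming NumPy; the shapes below, 784 pixels plus a bias column and a batch of 1,000 images, are my own illustrative choices rather than figures quoted from the book):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# "Stacking": X holds many MNIST images at once, one image per row.
# "Parallelizing": W holds ten perceptrons side by side, one column per digit.
X = np.random.rand(1000, 785)     # a batch of images: 784 pixels plus a bias column each
W = np.random.rand(785, 10)       # one column of weights for each of the ten digits

Y_hat = sigmoid(np.matmul(X, W))  # shape (1000, 10): one prediction per digit, per image
```

A single matrix multiplication processes the whole batch through all ten perceptrons at once, which is exactly what made the Part I classifier both fast and compact.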
To be clear, ...