March 2020
Here is a reprint of forward, a core function of our perceptron:
    def forward(X, w):
        weighted_sum = np.matmul(X, w)
        return sigmoid(weighted_sum)
forward implements the operation that we called “forward propagation”: it calculates the system’s outputs from the system’s inputs. In the case of the perceptron, it applies a weighted sum followed by a sigmoid. In the case of a neural network, things become slightly more complicated.
In fact, this is where the name “forward propagation” really comes into its own: passing an MNIST image through a neural network is like propagating data “forward” through the network’s layers, from input to hidden to output.
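As a sketch of that idea (not the book's exact code), forward propagation through a network with a single hidden layer might look like the following. The weight matrices `w1` and `w2`, and the use of a sigmoid on both layers, are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    # Squash each weighted sum into the (0, 1) range
    return 1 / (1 + np.exp(-z))

def forward(X, w1, w2):
    # Propagate the data "forward" through the layers:
    # input -> hidden -> output.
    # w1 and w2 are hypothetical weight matrices for the two layers.
    hidden = sigmoid(np.matmul(X, w1))
    return sigmoid(np.matmul(hidden, w2))

# Tiny example: 2 inputs, 3 hidden nodes, 1 output
X = np.array([[0.5, -1.2]])
w1 = np.zeros((2, 3))
w2 = np.zeros((3, 1))
y = forward(X, w1, w2)
```

Each layer repeats the perceptron's recipe, weighted sum followed by sigmoid, using the previous layer's output as its input.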
The first step of forward propagation is the same as a regular ...