Coding Forward Propagation
Here is a reprint of forward, a core function of our perceptron:
def forward(X, w):
    weighted_sum = np.matmul(X, w)
    return sigmoid(weighted_sum)
forward implements the operation that we called “forward propagation”: it calculates the system’s outputs from the system’s inputs. In the case of the perceptron, it applies a weighted sum followed by a sigmoid. In the case of a neural network, things become slightly more complicated.
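To make the weighted-sum-then-sigmoid step concrete, here is a minimal runnable sketch. The `sigmoid` helper shown is the standard logistic function, which is assumed to match the definition given earlier in the book; the example data and shapes are illustrative, not from the text:

```python
import numpy as np

def sigmoid(z):
    # Standard logistic function (assumed to match the book's earlier definition)
    return 1 / (1 + np.exp(-z))

def forward(X, w):
    weighted_sum = np.matmul(X, w)
    return sigmoid(weighted_sum)

# Illustrative data: two examples with three features each, plus a bias column of ones
X = np.array([[1.0, 0.5, -0.2, 1.0],
              [0.3, -0.1, 0.8, 1.0]])
w = np.zeros((4, 1))      # all-zero weights make every weighted sum 0
print(forward(X, w))      # sigmoid(0) = 0.5 for each example
```

With all-zero weights each weighted sum is 0, so every output is exactly sigmoid(0) = 0.5, which makes the two stages easy to verify by hand.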
In fact, this is where the name “forward propagation” really comes into its own: passing an MNIST image through a neural network is like propagating data “forward” through the network’s layers, from input to hidden to output.
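That layered flow can be sketched as a hypothetical two-layer `forward`, with one weight matrix per layer. This is only an illustration of the idea, not the book's actual network code (the names `w1`, `w2`, the layer sizes, and the use of a sigmoid on the output layer are all assumptions for this sketch; a bias column for the hidden layer is omitted for brevity):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def forward(X, w1, w2):
    # Propagate the data "forward": input -> hidden -> output
    h = sigmoid(np.matmul(X, w1))       # hidden layer activations
    y_hat = sigmoid(np.matmul(h, w2))   # output layer (an assumption for this sketch)
    return y_hat

# MNIST-like shapes: 784 pixels -> 100 hidden units -> 10 output classes
X = np.random.rand(5, 784)              # a batch of 5 flattened images
w1 = np.random.randn(784, 100) * 0.01   # input-to-hidden weights
w2 = np.random.randn(100, 10) * 0.01    # hidden-to-output weights
print(forward(X, w1, w2).shape)         # (5, 10): one row of 10 values per image
```

Each matrix multiplication carries the data one layer forward, which is exactly what the name "forward propagation" describes.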
The first step of forward propagation is the same as a regular ...