April 2026
461 pages
English
This chapter has taken you a big step forward. Starting from a single perceptron, we built a network of multiple perceptrons that solves the classic XOR problem. Along the way, you not only assembled this network but also implemented your own activation function. Hopefully some useful NumPy techniques have stuck as well; at the very least, you have now worked intensively with vector and matrix multiplication.
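As a compact recap, the XOR network from this chapter can be sketched with NumPy matrix multiplication alone. The weights below are hand-picked illustrative values (one hidden neuron acting as OR, one as AND), not necessarily the ones used earlier in the chapter:

```python
import numpy as np

def step(x):
    """Heaviside step activation: 1 where x >= 0, else 0."""
    return (x >= 0).astype(int)

# Hand-picked weights (illustrative values):
# hidden neuron 1 computes OR, hidden neuron 2 computes AND.
W_hidden = np.array([[1.0, 1.0],
                     [1.0, 1.0]])
b_hidden = np.array([-0.5, -1.5])

# The output neuron computes "OR and not AND", which is XOR.
W_out = np.array([1.0, -1.0])
b_out = -0.5

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

hidden = step(X @ W_hidden.T + b_hidden)
output = step(hidden @ W_out + b_out)
print(output)  # [0 1 1 0]
```

Note how the entire forward pass for all four inputs is two matrix multiplications, exactly the kind of vectorized NumPy operation practiced in this chapter.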
Thinking about activation functions then led us to build a feed-forward network that uses the sigmoid as its activation function. As mentioned, this choice will pay off in the next chapter, where we use it to implement learning in the multilayer network.