There is no such thing as a perfect replacement for the sigmoid. Different activation functions work well in different circumstances, and researchers keep coming up with brand-new ones. That being said, one activation function has proven so broadly useful that it’s become a default of sorts. Let’s talk about it.
The go-to replacement for the sigmoid these days is the rectified linear unit, or ReLU for friends. Compared with the sigmoid, the ReLU is surprisingly simple. Here’s a Python implementation of it:
def relu(z):
    # Non-positive inputs are clipped to 0; positive inputs
    # pass through unchanged.
    if z <= 0:
        return 0
    else:
        return z
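As a quick sanity check, here is the function applied to a few sample inputs (the values are arbitrary, chosen only for illustration):

print(relu(-3))    # 0: negative inputs are clipped to zero
print(relu(0))     # 0: zero stays at zero
print(relu(2.5))   # 2.5: positive inputs pass through unchanged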
And the following diagram illustrates what it looks like:
The ReLU is composed of two straight segments. However, taken together, those segments add up to a nonlinear function.
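If it seems odd that two straight segments can make a nonlinear function, a quick check makes the point concrete: a linear function f would satisfy f(a + b) = f(a) + f(b) for all inputs, and the ReLU does not. This minimal sketch reuses the relu defined above:

a, b = -1.0, 1.0
print(relu(a) + relu(b))   # 1.0
print(relu(a + b))         # 0.0, so relu(a + b) != relu(a) + relu(b)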