June 2020
Intermediate to advanced
382 pages
11h 39m
English
The outputs of the first two activation functions presented in this chapter were binary: they take a set of input variables and convert them into binary outputs. ReLU, by contrast, takes an input and converts it into a single continuous output, y = max(0, x). ReLU is the most popular activation function in neural networks and is usually used in the hidden layers, where we do not want to convert continuous variables into categorical variables.
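The contrast between a binary activation and ReLU can be sketched in a few lines of Python. This is a minimal illustration, not code from the book; the step function's threshold of 0 is an assumption chosen for the example:

```python
def step(x):
    # Binary activation: every input is mapped to exactly 0 or 1.
    return 1.0 if x > 0 else 0.0

def relu(x):
    # ReLU: continuous output, y = max(0, x).
    return max(0.0, x)

inputs = [-2.0, -0.5, 0.0, 0.5, 2.0]
print([step(v) for v in inputs])  # [0.0, 0.0, 0.0, 1.0, 1.0]
print([relu(v) for v in inputs])  # [0.0, 0.0, 0.0, 0.5, 2.0]
```

Note how the step function collapses every input to one of two categories, while ReLU passes positive values through unchanged, preserving their magnitude.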
The following diagram summarizes the ReLU activation function:

[Figure: plot of the ReLU function, y = max(0, x)]
Note that when x ≤ 0, y = 0. This means that any signal ...