This is how to apply the ReLU activation function.
The ReLU activation function is applied to the result of the first dense operation. ReLU outputs the input unchanged for values >= 0 and outputs 0 for values < 0:

f(input_value) = max(0, input_value)
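As a minimal sketch, the max(0, x) rule above can be written with NumPy's element-wise maximum, applied here to a hypothetical array of pre-activation values:

```python
import numpy as np

def relu(x):
    # pass values >= 0 through unchanged, clamp negatives to 0
    return np.maximum(0, x)

# hypothetical output of a first dense operation
pre_activation = np.array([-2.0, 0.0, 3.5])
print(relu(pre_activation))  # negatives become 0, non-negatives are kept
```

Because np.maximum broadcasts, the same function works unchanged on the full output tensor of a dense layer, whatever its shape.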
This is how to apply the logistic sigmoid activation function.
The logistic sigmoid activation function, applied to the result of the second dense operation as described in Chapter 2, Think like a Machine, produces a value between 0 and 1:

LS(x) = 1 / (1 + e^(-x)), with 0 < LS(x) < 1
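A minimal sketch of the logistic sigmoid, using the formula above directly (the function name is an illustrative choice, not a library API):

```python
import math

def logistic_sigmoid(x):
    # squashes any real input into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

print(logistic_sigmoid(0.0))   # exactly halfway: 0.5
print(logistic_sigmoid(5.0))   # large positive input -> close to 1
print(logistic_sigmoid(-5.0))  # large negative input -> close to 0
```

Note that the output only approaches 0 and 1 asymptotically; it never reaches either endpoint, which is why the range is the open interval (0, 1).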
We will now discuss the last dense layer after the LS activation function.
The last dense layer is size 1 and will classify the initial input, an image in this case:
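The whole sequence described so far (first dense operation, ReLU, second dense operation, logistic sigmoid, then a final dense layer of size 1) can be sketched as a NumPy forward pass. The layer sizes and random weights below are illustrative assumptions, not values from the chapter:

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    # a dense (fully connected) operation: weighted sum plus bias
    return x @ w + b

# hypothetical sizes: a flattened 4-value input and two hidden layers of size 3
x = rng.normal(size=(1, 4))                  # stand-in for a flattened image
w1, b1 = rng.normal(size=(4, 3)), np.zeros(3)
w2, b2 = rng.normal(size=(3, 3)), np.zeros(3)
w3, b3 = rng.normal(size=(3, 1)), np.zeros(1)

h1 = np.maximum(0, dense(x, w1, b1))          # ReLU after the first dense op
h2 = 1 / (1 + np.exp(-dense(h1, w2, b2)))     # logistic sigmoid after the second
out = dense(h2, w3, b3)                       # last dense layer, size 1
print(out.shape)                              # a single score for the input image
```

The size-1 output is the single value the network uses to classify the initial input; thresholding it (for example at 0.5 after a final sigmoid) yields the class label.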
The layers of ...