When designing the neural network model architecture, we also need to decide which activation function to use for each layer. Activation functions play an important role in neural networks. You can think of an activation function as a transformer: it takes an input value, transforms it, and passes the transformed value on to the next layer.
In this project, we will use the rectified linear unit (ReLU) and the sigmoid as our activation functions.
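As a rough illustration (not taken from the project's own code), the following NumPy sketch shows how ReLU and sigmoid transform a layer's raw outputs before they are passed to the next layer:

```python
import numpy as np

def relu(x):
    # ReLU passes positive values through unchanged and maps
    # negative values to zero.
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid squashes any real-valued input into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

# Example raw outputs from a layer (hypothetical values).
z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])

print(relu(z))      # [0.   0.   0.   0.5  2. ]
print(sigmoid(z))   # every value lies between 0 and 1
```

ReLU is a common default for hidden layers because it is cheap to compute, while the sigmoid's (0, 1) output range makes it a natural fit for layers that should produce probability-like values.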