Let's now create a model and add layers to it. We can experiment with the number of units in each layer, but if we are unsure what value to start with, a reasonable rule of thumb is to initialize the units of every layer except the last one to (number of features + number of output nodes) / 2, which equals 15 in our case. As explained in the following points, we only have to provide an input dimension for the first layer. The relu activation refers to the rectified linear unit, and sigmoid refers to the sigmoid activation function. With the sigmoid activation function, we obtain the probability of each class, which can be useful for further analysis.
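As a concrete illustration, here is a minimal sketch of such a model in Keras. The feature count of 29 is a hypothetical assumption chosen so that (29 + 1) / 2 = 15 matches the unit count above, and the optimizer and loss in the compile step are illustrative choices for a binary classifier, not values taken from the text:

    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import Dense

    model = Sequential()
    # Only the first layer needs the input dimension; Keras infers it
    # for every subsequent layer.
    model.add(Dense(units=15, activation='relu', input_dim=29))
    model.add(Dense(units=15, activation='relu'))
    # A single sigmoid unit outputs the probability of the positive class.
    model.add(Dense(units=1, activation='sigmoid'))
    model.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=['accuracy'])

Because the final layer is a sigmoid, calling model.predict(...) returns values between 0 and 1 that can be read as class probabilities rather than hard labels.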
For every model, there are hyperparameters that are set ...