Experimenting using a deeper network with more units in the hidden layer

After building three different neural network models with 203, 239, and 753 parameters respectively, we will now build a deeper neural network with more units in its hidden layers. The code used for this experiment is as follows:

# Model architecture
model <- keras_model_sequential()
model %>%
        layer_dense(units = 40, activation = 'relu', input_shape = c(21)) %>%
        layer_dropout(rate = 0.4) %>%
        layer_dense(units = 30, activation = 'relu') %>%
        layer_dropout(rate = 0.3) %>%
        layer_dense(units = 20, activation = 'relu') %>%
        layer_dropout(rate = 0.2) %>%
        layer_dense(units = 3, activation = 'softmax')
summary(model)

OUTPUT
__________________________________________________________________________ ...
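To see how this architecture compares with the earlier 203-, 239-, and 753-parameter models, the trainable-parameter count can be verified by hand: each dense layer contributes inputs × units weights plus units biases, while dropout layers add no parameters. A small sketch of that arithmetic for the architecture above:

```r
# Parameter count per dense layer: inputs * units (weights) + units (biases)
p1 <- 21 * 40 + 40   # first hidden layer:  880
p2 <- 40 * 30 + 30   # second hidden layer: 1230
p3 <- 30 * 20 + 20   # third hidden layer:  620
p4 <- 20 * 3  + 3    # softmax output:      63
p1 + p2 + p3 + p4    # 2793 trainable parameters in total
```

At 2,793 parameters, this model is roughly 3.7 times larger than the biggest of the three earlier models, which is the point of the experiment.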

Get Advanced Deep Learning with R now with the O’Reilly learning platform.