Accuracy can often be improved by tuning hyperparameters such as the number of hidden layers, the number of neurons in each hidden layer, the number of epochs, and the activation function. The current implementation of the H2O-based deep learning model supports the following activation functions:
- ExpRectifier
- ExpRectifierWithDropout
- Maxout
- MaxoutWithDropout
- Rectifier
- RectifierWithDropout
- Tanh
- TanhWithDropout
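As a quick refresher on what these options compute, the underlying per-neuron functions can be sketched in plain Python. This is an illustrative sketch, not H2O's implementation: the `...WithDropout` variants apply the same function and additionally drop units at random during training, and I am assuming the usual interpretations (Rectifier as ReLU, ExpRectifier as an exponential linear unit, Maxout as the maximum over a group of linear responses).

```python
import math

def rectifier(x):
    # Rectifier (ReLU): max(0, x)
    return max(0.0, x)

def tanh(x):
    # Tanh: squashes the input into the range (-1, 1)
    return math.tanh(x)

def exp_rectifier(x, alpha=1.0):
    # ExpRectifier (ELU-style): x for x > 0, alpha * (e^x - 1) otherwise
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def maxout(linear_responses):
    # Maxout: the maximum over a group of linear responses for one unit
    return max(linear_responses)
```

The choice mostly affects how gradients flow: Tanh saturates for large inputs, while the rectifier family keeps gradients alive for positive activations.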
For this project I have only tried the Tanh activation function; you should definitely experiment with the others.
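One simple way to experiment is an exhaustive grid search over the hyperparameters mentioned above. The sketch below only enumerates candidate settings; `train_and_score` is a hypothetical placeholder that, in a real run, would train the H2O model with those settings and return a validation accuracy:

```python
import itertools

# Candidate settings: each layout is "hidden layers x neurons per layer",
# and the activation names are spelled as H2O expects them.
hidden_layouts = [[64], [64, 64], [128, 128]]
epochs_options = [10, 50]
activations = ["Rectifier", "Tanh", "Maxout"]

def train_and_score(hidden, epochs, activation):
    # Hypothetical placeholder: substitute real training/evaluation here.
    return 0.0

best = None
for hidden, epochs, activation in itertools.product(
        hidden_layouts, epochs_options, activations):
    score = train_and_score(hidden, epochs, activation)
    if best is None or score > best[0]:
        best = (score, hidden, epochs, activation)
# `best` now holds the highest-scoring combination found.
```

With these lists, the loop evaluates 3 × 2 × 3 = 18 combinations; in practice you would trim the grid or use a random search, since each combination costs a full training run.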
One of the biggest advantages of H2O-based deep learning algorithms is that we can extract the relative variable/feature importance. In previous chapters, we have ...