OK, we are going to create the model function, but not the model itself. The key function is keras_model_sequential(). There is a ton of stuff you can specify. What I'm going to show is two hidden layers with 64 neurons each. In both layers, the activation function is relu, which I covered earlier and which works well for a regression problem. After the first layer, I demonstrate how to incorporate a dropout layer with a rate of 30%. Then, on the second hidden layer, I incorporate L1 regularization, or LASSO, which we discussed in Chapter 4, Advanced Feature Selection in Linear Models. I thought it was important to show how to use both regularization methods, so you can use and adjust them as you deem fit.
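As a rough sketch of what that specification might look like with the keras R interface: it assumes a training feature matrix called x_train, an illustrative L1 penalty of 0.001, and a single linear output unit for the regression target; adjust these to your own data and tuning.

    library(keras)

    # Sketch only: two hidden layers of 64 relu neurons, 30% dropout after the
    # first, and an L1 (LASSO) penalty on the second hidden layer's weights.
    # x_train and the penalty value l = 0.001 are placeholder assumptions.
    model <- keras_model_sequential() %>%
      layer_dense(units = 64, activation = "relu",
                  input_shape = ncol(x_train)) %>%
      layer_dropout(rate = 0.3) %>%
      layer_dense(units = 64, activation = "relu",
                  kernel_regularizer = regularizer_l1(l = 0.001)) %>%
      layer_dense(units = 1)  # single linear output for a regression target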
The next function within ...