Java Deep Learning Projects by Md. Rezaul Karim

Answers to questions

Answer to question 1: The following hyperparameters are especially important and must be tuned to achieve good results; a configuration sketch wiring most of them together follows the list:

  • Dropout randomly deactivates certain neurons (that is, feature detectors) during training to prevent overfitting
  • Learning rate optimization: AdaGrad can be used for per-parameter (feature-specific) learning rates
  • Regularization: L1 and/or L2 weight penalties
  • Gradient normalization and clipping to keep updates stable
  • Finally, batch normalization to reduce internal covariate shift during training
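As a concrete illustration, here is a minimal sketch of how most of these knobs might be wired into a single network configuration, assuming Deeplearning4j (DL4J); the layer sizes, the 0.01 learning rate, and the 1e-4 L2 coefficient are illustrative values, not tuned settings (dropout is shown separately below):

    import org.deeplearning4j.nn.conf.GradientNormalization;
    import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
    import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
    import org.deeplearning4j.nn.conf.layers.BatchNormalization;
    import org.deeplearning4j.nn.conf.layers.DenseLayer;
    import org.deeplearning4j.nn.conf.layers.OutputLayer;
    import org.nd4j.linalg.activations.Activation;
    import org.nd4j.linalg.learning.config.AdaGrad;
    import org.nd4j.linalg.lossfunctions.LossFunctions;

    MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .seed(12345)
        .updater(new AdaGrad(0.01))           // AdaGrad: per-parameter learning rates
        .l2(1e-4)                             // L2 weight regularization
        .gradientNormalization(GradientNormalization.ClipElementWiseAbsoluteValue)
        .gradientNormalizationThreshold(1.0)  // clip gradients element-wise at +/-1.0
        .list()
        .layer(new DenseLayer.Builder().nIn(784).nOut(256)
            .activation(Activation.RELU).build())
        .layer(new BatchNormalization.Builder().nIn(256).nOut(256)
            .build())                         // batch norm to reduce internal covariate shift
        .layer(new OutputLayer.Builder(LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
            .nIn(256).nOut(10).activation(Activation.SOFTMAX).build())
        .build();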

Now, for dropout, we can add a dropout probability to each convolutional and dense layer, as the sketch below shows. In the case of overfitting, the model becomes specifically adjusted to the training dataset and loses its ability to generalize. Therefore, although it performs well on the training set, it performs poorly on unseen data.
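For example, in DL4J a per-layer dropout can be attached to both convolutional and dense layers; note that DL4J's dropOut(x) is the probability of retaining an activation, and the 0.5 used here is a common default rather than a tuned value:

    import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
    import org.deeplearning4j.nn.conf.layers.DenseLayer;
    import org.nd4j.linalg.activations.Activation;

    // 5x5 convolution with dropout applied to its input activations
    ConvolutionLayer conv = new ConvolutionLayer.Builder(5, 5)
        .nIn(1).nOut(20)
        .activation(Activation.RELU)
        .dropOut(0.5)   // retain each activation with probability 0.5 during training
        .build();

    // Fully connected layer with the same per-layer dropout
    DenseLayer dense = new DenseLayer.Builder()
        .nIn(500).nOut(256)
        .activation(Activation.RELU)
        .dropOut(0.5)
        .build();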
