Java Deep Learning Projects

by Md. Rezaul Karim
June 2018
Intermediate to advanced
436 pages
English
Packt Publishing
Content preview from Java Deep Learning Projects

Answers to questions

Answer to question 1: The following hyperparameters are very important and must be tuned to achieve optimal results:

  • Dropout is used to randomly turn off certain neurons (that is, feature detectors) to prevent overfitting
  • Learning rate optimization—Adagrad can be used for feature-specific learning rate optimization
  • Regularization—L1 and/or L2 regularization
  • Gradient normalization and clipping
  • Finally, apply batch normalization to reduce internal covariate shift in training
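
The hyperparameters above can be wired together in a single network configuration. The following is a minimal sketch assuming Deeplearning4j (DL4J); all layer sizes, rates, and thresholds are illustrative placeholders, not tuned values from the book:

```java
import org.deeplearning4j.nn.conf.GradientNormalization;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.*;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.AdaGrad;
import org.nd4j.linalg.lossfunctions.LossFunctions;

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
    .updater(new AdaGrad(0.01))                        // feature-specific learning rates
    .l2(1e-4)                                          // L2 regularization
    .gradientNormalization(
        GradientNormalization.ClipElementWiseAbsoluteValue)
    .gradientNormalizationThreshold(1.0)               // gradient clipping
    .list()
    .layer(new ConvolutionLayer.Builder(5, 5)
        .nIn(1).nOut(20)
        .activation(Activation.RELU)
        .dropOut(0.5)                                  // dropout in the conv layer
        .build())
    .layer(new BatchNormalization.Builder().build())   // reduce internal covariate shift
    .layer(new DenseLayer.Builder()
        .nOut(500)
        .activation(Activation.RELU)
        .dropOut(0.5)                                  // dropout in the dense layer
        .build())
    .layer(new OutputLayer.Builder(
            LossFunctions.LossFunction.NEGATIVELOGLIKELIHOOD)
        .nOut(10)
        .activation(Activation.SOFTMAX)
        .build())
    .setInputType(InputType.convolutionalFlat(28, 28, 1))
    .build();
```

Each bullet maps to one builder call: the updater handles learning rate optimization, `l2` handles regularization, the gradient-normalization pair handles clipping, and `dropOut` plus the `BatchNormalization` layer cover the remaining two points.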

Now, for dropout, we can add dropout to each convolutional and dense layer. In the case of overfitting, the model is fitted too specifically to the training dataset, so it cannot be used for generalization. Therefore, although it performs well on the ...
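
The dropout mechanism itself is simple to demonstrate in plain Java. This sketch (a hypothetical helper, not code from the book) implements inverted dropout: each activation is zeroed with probability p, and survivors are scaled by 1/(1-p) so the expected activation is unchanged and no rescaling is needed at inference time:

```java
import java.util.Random;

public class DropoutDemo {
    // Inverted dropout: zero each activation with probability p and
    // scale the survivors by 1/(1 - p) to preserve the expected value.
    static double[] dropout(double[] activations, double p, Random rng) {
        double[] out = new double[activations.length];
        for (int i = 0; i < activations.length; i++) {
            out[i] = (rng.nextDouble() < p) ? 0.0 : activations[i] / (1.0 - p);
        }
        return out;
    }

    public static void main(String[] args) {
        double[] a = {1.0, 1.0, 1.0, 1.0};
        // With p = 0.5, each output is either 0.0 (dropped) or 2.0 (kept, rescaled)
        double[] dropped = dropout(a, 0.5, new Random(42));
        for (double v : dropped) {
            System.out.print(v + " ");
        }
        System.out.println();
    }
}
```

Because a different random subset of neurons is silenced on every training pass, no single feature detector can dominate, which is why dropout acts as a regularizer against overfitting.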



Publisher Resources

ISBN: 9781788997454