R Deep Learning Essentials - Second Edition

by Mark Hodnett, Joshua F. Wiley
August 2018
Intermediate to advanced
378 pages
9h 9m
English
Packt Publishing
Content preview from R Deep Learning Essentials - Second Edition

Dropout

Dropout is a form of regularization that aims to prevent a model from overfitting. Overfitting occurs when a model memorizes parts of the training dataset and is therefore less accurate on unseen test data. When you build a model, you can check whether overfitting is a problem by comparing accuracy on the training set with accuracy on the test set: if performance is much better on the training data, the model is overfitting. Dropout temporarily removes randomly selected nodes from the network during training. It is usually applied only to hidden layers, not input layers. Here is an example of dropout applied to a neural network:
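To make the idea concrete, here is a minimal sketch of (inverted) dropout applied to one hidden layer's activations, using only base R. The dropout rate, layer sizes, and variable names are illustrative assumptions, not taken from the book's code:

```r
# Illustrative sketch of inverted dropout (assumed rate and layer sizes)
set.seed(42)
rate <- 0.5                        # probability of dropping each hidden node
h <- matrix(runif(8), nrow = 2)    # activations: 2 samples x 4 hidden nodes

# Training time: randomly zero out nodes, then scale the survivors
# by 1 / (1 - rate) so the expected activation is unchanged
mask <- matrix(rbinom(length(h), 1, 1 - rate), nrow = nrow(h))
h_train <- h * mask / (1 - rate)

# Test time: no dropout; all nodes are used as-is
h_test <- h
```

Because of the 1 / (1 - rate) rescaling during training, no adjustment is needed at test time; deep learning frameworks typically implement dropout this way.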

Figure 5.11: An example of dropout in a deep learning model

For each forward ...



ISBN: 9781788992893