R Deep Learning Essentials - Second Edition
by Mark Hodnett, Joshua F. Wiley
Packt Publishing, August 2018
378 pages (9h 9m), English
Intermediate to advanced
Content preview from R Deep Learning Essentials - Second Edition

Use case – improving out-of-sample model performance using dropout

Dropout is a novel approach to regularization that is particularly valuable for large and complex deep neural networks. For a much more detailed exploration of dropout in deep neural networks, see Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., and Salakhutdinov, R. (2014), Dropout: A Simple Way to Prevent Neural Networks from Overfitting. The concept behind dropout is quite straightforward. During the training of the model, units (for example, input and hidden neurons) are probabilistically dropped, along with all connections to and from them.
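As a concrete illustration (this is a sketch, not code from the book), the snippet below shows one way to apply dropout in R using the keras package: dropout layers are placed after the dense layers they regularize. The layer sizes, the input shape of 784, and the dropout rate of 0.5 are arbitrary assumptions for this example.

library(keras)

# Build a simple fully connected network with dropout regularization.
# layer_dropout() randomly zeroes the given fraction of the preceding
# layer's units at each training step; Keras disables dropout
# automatically at prediction time.
model <- keras_model_sequential() %>%
  layer_dense(units = 256, activation = "relu", input_shape = c(784)) %>%
  layer_dropout(rate = 0.5) %>%
  layer_dense(units = 128, activation = "relu") %>%
  layer_dropout(rate = 0.5) %>%
  layer_dense(units = 10, activation = "softmax")

model %>% compile(
  loss = "categorical_crossentropy",
  optimizer = optimizer_rmsprop(),
  metrics = "accuracy"
)

In practice, the dropout rate is a tuning parameter: higher rates give stronger regularization at the cost of slower convergence.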

For example, the following diagram shows what might happen at each step of training for a model where hidden neurons and their connections are dropped with a probability ...



Publisher Resources

ISBN: 9781788992893