
Python Deep Learning

by Valentino Zocca, Gianmario Spacagna, Daniel Slater, Peter Roelants
April 2017
Intermediate to advanced
406 pages
10h 15m
English
Packt Publishing
Content preview from Python Deep Learning

Dropout

Another important technique, commonly applied after a pooling layer but also applicable to fully connected layers in general, is to "drop" some neurons, together with their corresponding input and output connections, randomly during training. In a dropout layer we specify a probability p with which neurons "drop out" stochastically: during each training iteration, every neuron is dropped from the network with probability p and kept with probability (1 - p). This ensures that no neuron ends up relying too heavily on other neurons, and that each neuron "learns" something useful for the network. Dropout has two advantages: it speeds up training, since a smaller network is trained at each step, and it helps prevent overfitting (see N. Srivastava, ...
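The mechanics described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the book's implementation; it uses the common "inverted dropout" variant, which additionally scales the kept activations by 1/(1 - p) during training so that the expected activation is unchanged and no rescaling is needed at test time:

```python
import numpy as np

def dropout(activations, p, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training.

    Kept units are scaled by 1/(1 - p) so the expected value of each
    activation stays the same; at test time the input passes through
    unchanged.
    """
    if not training or p == 0.0:
        return activations
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(activations.shape) >= p  # keep with probability (1 - p)
    return activations * mask / (1.0 - p)

# Example: with p = 0.5, dropped units become 0 and kept units
# are scaled from 1.0 up to 2.0.
x = np.ones((4, 5))
y = dropout(x, p=0.5, rng=np.random.default_rng(0))
```

Because each training step samples a fresh mask, every iteration effectively trains a different thinned sub-network, which is what discourages neurons from co-adapting.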



ISBN: 9781786464453