Python Deep Learning - Second Edition

by Ivan Vasilev, Daniel Slater, Gianmario Spacagna, Peter Roelants, Valentino Zocca
January 2019
Intermediate to advanced
386 pages
11h 13m
English
Packt Publishing
Content preview from Python Deep Learning - Second Edition

Backpropagation in convolutional layers

In Chapter 2, Neural Networks, we talked about backpropagation in general, and for fully-connected layers in particular. In a fully-connected layer, an input neuron contributes to all output neurons. Because of this, when the gradient is routed back, all output neurons contribute back to the original neuron. In effect, the backward pass uses the same weighted-sum operation as the forward pass. The same rule applies to convolutional layers, where the neurons are locally connected. In the Convolutional layers section, we observed how a neuron participates in the inputs of several output neurons. This is illustrated in the following diagram, where we can see a convolution operation with a 3×3 filter. ...
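As a minimal sketch of this idea (not taken from the book; the 5×5 input, the sum-of-outputs loss, and the use of SciPy are assumptions for illustration), the gradient with respect to the input of a valid cross-correlation is a full convolution of the output gradient with the same filter, which is equivalent to cross-correlating with the filter rotated by 180 degrees:

import numpy as np
from scipy.signal import correlate2d, convolve2d

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 5))   # hypothetical 5x5 input feature map
w = rng.standard_normal((3, 3))   # 3x3 filter, as in the diagram

# Forward pass: a "valid" cross-correlation, the usual convolutional layer operation
out = correlate2d(x, w, mode='valid')          # shape (3, 3)

# Assume the loss is the sum of the outputs, so dL/dout is all ones
dout = np.ones_like(out)

# Backward pass:
# gradient w.r.t. the input = full convolution of dout with the filter
# (equivalently, a full cross-correlation with the 180-degree rotated filter)
dx = convolve2d(dout, w, mode='full')          # shape (5, 5)
# gradient w.r.t. the filter = valid cross-correlation of the input with dout
dw = correlate2d(x, dout, mode='valid')        # shape (3, 3)

# Numerical check of dx against central finite differences
eps = 1e-6
dx_num = np.zeros_like(x)
for i in range(x.shape[0]):
    for j in range(x.shape[1]):
        xp, xm = x.copy(), x.copy()
        xp[i, j] += eps
        xm[i, j] -= eps
        dx_num[i, j] = (correlate2d(xp, w, mode='valid').sum()
                        - correlate2d(xm, w, mode='valid').sum()) / (2 * eps)

print(np.allclose(dx, dx_num, atol=1e-5))      # True

The finite-difference check confirms that routing the gradient back through the locally-connected pattern of the filter is itself a convolution-like operation, mirroring what happens in the fully-connected case.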


Publisher Resources

ISBN: 9781789348460