Hands-On Mathematics for Deep Learning
by Jay Dawani (Packt Publishing, June 2020)

Transposed convolutions

We know that repeatedly applying a convolution to an image reduces its size, but what if we would like to go in the opposite direction; that is, from the shape of the output back to the shape of the input, while still maintaining local connectivity? To do this, we use the transposed convolution, which draws its name from matrix transposition (which you should remember from Chapter 1, Linear Algebra).
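
To see the shape reversal concretely before switching to the matrix view, here is a minimal sketch using PyTorch's nn.Conv2d and nn.ConvTranspose2d (the framework is an illustrative choice; this excerpt doesn't name one):

import torch
from torch import nn

x = torch.randn(1, 1, 4, 4)   # (batch, channels, height, width)

# A 3x3 convolution with no padding shrinks the 4x4 image to 2x2 ...
conv = nn.Conv2d(in_channels=1, out_channels=1, kernel_size=3)
y = conv(x)
print(y.shape)                # torch.Size([1, 1, 2, 2])

# ... and a transposed convolution with the same kernel size recovers
# the 4x4 *shape* (not the original values), while each output pixel
# still depends only on a local neighborhood of the input.
up = nn.ConvTranspose2d(in_channels=1, out_channels=1, kernel_size=3)
x_up = up(y)
print(x_up.shape)             # torch.Size([1, 1, 4, 4])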

Let's suppose we have a 4 × 4 input and a 3 × 3 kernel. A valid, stride-1 convolution then produces a 2 × 2 output, so if we flatten the input into a 16-element vector, we can rewrite the kernel as a 4 × 16 matrix and carry out the convolution as a single matrix multiplication. This looks as follows:
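
As a minimal NumPy sketch of that construction (the variable names are ours, and the kernel and input values are purely illustrative), we can build the 4 × 16 matrix row by row, apply it to the flattened input, and then use its transpose to map the four output values back to a 16-element vector:

import numpy as np

K = np.arange(1, 10, dtype=float).reshape(3, 3)   # illustrative 3x3 kernel
x = np.arange(16, dtype=float).reshape(4, 4)      # illustrative 4x4 input

# Each row of C holds the kernel weights for one of the 2x2 = 4 output
# positions of a stride-1, valid convolution over the flattened input.
C = np.zeros((4, 16))
for i in range(2):
    for j in range(2):
        window = np.zeros((4, 4))
        window[i:i + 3, j:j + 3] = K
        C[2 * i + j] = window.ravel()

y = C @ x.ravel()        # the convolution as a matrix product: shape (4,)
print(y.reshape(2, 2))   # the usual 2x2 convolution output

# The transposed convolution uses C^T to go from the output shape (4)
# back to the input shape (16), while keeping the same connectivity.
x_up = C.T @ y
print(x_up.reshape(4, 4))

Each row of C is just the flattened kernel placed at one of the four window positions, so multiplying by C^T scatters every output value back over exactly the input positions that produced it, which is how the transposed convolution preserves local connectivity.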

If you look closely, ...

