Chapter 10

Explaining Convolutional Neural Networks

IN THIS CHAPTER

- Introducing the basics of computer vision

- Determining how convolutional neural networks work

- Recreating a LeNet5 network using Keras

- Explaining how convolutions see the world

When you look inside deep learning, you may be surprised to find a lot of old technology, yet everything works as it never has before because researchers finally know how to make simple, older solutions work together. As a result, deep learning applications can automatically filter, process, and transform big data.

For instance, activations like the Rectified Linear Unit (ReLU), discussed in previous chapters, aren’t new, but you see them used in new ways. ReLU is a neural network activation function that leaves positive values untouched and turns negative values into zero; the first reference to ReLU appears in a scientific paper by Hahnloser and others from 2000. Also, the image recognition capabilities that made deep learning so popular a few years ago aren’t new, either.
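As a quick illustration, here is a minimal sketch of the ReLU function written in Python with NumPy (the function name relu and the sample values are our own; Keras and other frameworks ship their own built-in versions):

import numpy as np

def relu(x):
    # Leave positive values untouched; turn negative values into zero.
    # np.maximum compares each element of x against 0.
    return np.maximum(0, x)

values = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(values))  # prints: [0.  0.  0.  1.5 3. ]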

In recent years, deep learning has achieved great momentum thanks to the ability to ...
