Chapter 10
Explaining Convolutional Neural Networks
IN THIS CHAPTER
Introducing the basics of computer vision
Determining how convolutional neural networks work
Recreating a LeNet5 network using Keras
Explaining how convolutions see the world
When you look inside deep learning, you may be surprised to find a lot of old technology, yet everything works as it never has before because researchers finally know how to make some simple, older solutions work together. As a result, deep learning can automatically filter, process, and transform big data.
For instance, activations like the Rectified Linear Unit (ReLU), discussed in previous chapters, aren’t new, but you see them used in new ways. ReLU is a neural network activation function that leaves positive values untouched and turns negative ones into zero; you can find a first reference to ReLU in a scientific paper by Hahnloser and colleagues from 2000. Likewise, the image recognition capabilities that made deep learning so popular a few years ago aren’t new, either.
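To make the ReLU definition concrete, here is a minimal sketch in plain Python (the names `relu`, `inputs`, and `outputs` are illustrative, not from the chapter):

```python
def relu(x):
    # ReLU: positive values pass through unchanged; negatives become 0
    return max(0.0, x)

# A few sample activations flowing through ReLU
inputs = [-2.0, -0.5, 0.0, 1.5, 3.0]
outputs = [relu(v) for v in inputs]
# The negative inputs map to 0.0; the positive inputs are unchanged
```

Frameworks such as Keras provide ReLU as a built-in activation, so in practice you select it by name rather than writing it yourself; the function itself is just this simple thresholding rule.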
In recent years, deep learning achieved great momentum thanks to the ability to ...