Chapter 16
Hitting Complexity with Neural Networks
IN THIS CHAPTER
Upgrading the perceptron to the interconnection paradigm
Structuring neural architectures made of nodes and connections
Getting a glimpse of the backpropagation algorithm
Understanding what deep learning is and what it can achieve
As you journey through the world of machine learning, you often see metaphors from the natural world used to explain the details of algorithms. This chapter presents a family of learning algorithms that draws its inspiration directly from how the brain works: neural networks, the core algorithms of the connectionists’ tribe.
Starting from the idea of reverse-engineering how a brain processes signals, the connectionists base neural networks on biological analogies and components, borrowing brain terms such as neurons and axons as names. However, when you examine their mathematical formulation, you’ll discover that neural networks resemble nothing more than a sophisticated kind of linear regression. Yet these algorithms are extraordinarily effective against complex problems such as image and sound recognition or machine translation of languages, and they also execute quickly when predicting.
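To see why the math resembles linear regression, consider a single artificial neuron. The following sketch (not from the book; the sigmoid activation and the sample numbers are assumptions chosen purely for illustration) shows that a neuron first computes the same weighted sum plus bias that linear regression uses, and only then passes the result through a nonlinear activation function.

import numpy as np

def sigmoid(z):
    # Squash any real number into the (0, 1) range.
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical example values: three inputs, their connection weights, and a bias.
x = np.array([0.5, -1.2, 3.0])   # input features
w = np.array([0.8, 0.1, -0.4])   # connection weights
b = 0.2                          # bias term

linear_part = np.dot(w, x) + b   # the same formula linear regression uses
output = sigmoid(linear_part)    # the nonlinearity is what the neuron adds on top

print("weighted sum:", linear_part)
print("neuron output:", output)

Stack many such neurons in layers, connect each layer's outputs to the next layer's inputs, and you have a neural network; the nonlinear activations between layers are what let the whole structure tackle problems that plain linear regression cannot.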
Well-devised neural networks go by the name of deep learning and are behind powerful tools such as Siri and other digital assistants. They are also behind some of the most astonishing machine learning applications. For instance, you see them at work in this incredible demonstration by Rick Rashid, the head of Microsoft Research, who is speaking ...