Chapter 9

Analyzing Sequential Data with Recurrent Neural Networks (RNNs)


In This Chapter

✓ Analyzing sequential data with Recurrent Neural Networks (RNNs)

✓ Improving performance with Long Short-Term Memory (LSTM)

✓ Improving performance further with Gated Recurrent Units (GRUs)

Suppose that you want a neural network to predict the next word in the phrase “My hovercraft is full of….” As any Monty Python fan (or a casual web search) will tell you, the obvious answer is “eels.” But how can you train a neural network to arrive at the answer?

You could feed the network every sentence ever written on the Internet, but there's still a problem. To make the prediction, the neural network needs to recognize that the words form an ordered sequence. That is, the network needs to understand that “My hovercraft is full of” is a different phrase than “full is My of hovercraft.”

None of the neural networks discussed in Chapters 1 through 8 of this book are capable of recognizing sequences. As a consequence, they can't use past analysis to solve future problems. For example, a CNN can classify an image, but it can't classify later images based on previous classifications. To make up for these ...
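To see why order matters to an RNN, consider a minimal sketch (not from the book) of the recurrence at an RNN's core: each new hidden state is computed from the current input *and* the previous hidden state, so earlier steps influence later ones. The weight names and sizes below are illustrative, not a specific TensorFlow API.

```python
import numpy as np

rng = np.random.default_rng(0)

input_size, hidden_size = 4, 3
W_x = rng.standard_normal((hidden_size, input_size)) * 0.1   # input weights
W_h = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # recurrent weights
b = np.zeros(hidden_size)

def run_rnn(sequence):
    """Process a sequence of vectors one time step at a time."""
    h = np.zeros(hidden_size)                # hidden state starts empty
    for x in sequence:
        h = np.tanh(W_x @ x + W_h @ h + b)   # new state depends on old state
    return h

seq = rng.standard_normal((5, input_size))   # a sequence of five time steps
final = run_rnn(seq)
reversed_final = run_rnn(seq[::-1])          # same inputs, reversed order

# Because state flows through time, reordering the inputs changes the
# result -- unlike a feed-forward network that treats inputs as a set.
order_matters = not np.allclose(final, reversed_final)
```

The hidden state `h` is what lets the network carry past analysis forward; the LSTM and GRU variants covered later in this chapter refine how that state is updated and preserved.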
