Introducing Recurrent Neural Networks

You might be curious why basic feed-forward networks alone aren’t ideal for processing sequences. The problem is that feed-forward networks have no way to account for temporal dependencies.

A temporal dependency is simply a dependency on time. When dealing with sequences, it’s common to refer to the entries in a sequence as individual timesteps, even when the entries don’t map to actual points in time. A temporal dependency means that what happens next depends on what has already happened, or on what is currently happening. If you think about this in the context of natural language, it should make sense.

In a sequence of words from natural language, order matters. The ...
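To make the idea of a temporal dependency concrete, here is a minimal sketch in plain Elixir of a recurrent update: a hidden state is folded forward through the sequence, so each output depends on every entry that came before it. The decay factor `0.5` and the update rule are illustrative assumptions, not a real RNN cell.

```elixir
# A toy recurrent update: h_t = 0.5 * h_{t-1} + x_t.
# Enum.scan/3 threads the hidden state across timesteps,
# emitting the state at each step.
sequence = [1, 2, 3, 4]

hidden_states =
  Enum.scan(sequence, 0, fn x, h -> 0.5 * h + x end)

IO.inspect(hidden_states)
# => [1.0, 2.5, 4.25, 6.125]
```

Notice that reordering the input sequence changes every subsequent hidden state, which is exactly the order-sensitivity a feed-forward network cannot capture on its own.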
