9 Improving retention with long short-term memory networks
This chapter covers
- Adding deeper memory to recurrent neural nets
- Gating information inside neural nets
- Classifying and generating text
- Modeling language patterns
For all the benefits recurrent neural nets provide for modeling relationships, and therefore possibly causal relationships, in sequence data, they suffer from one main deficiency: a token’s effect is almost completely lost by the time just two more tokens have passed.[1] Any effect the first node has on the third node (two time steps after the first) is thoroughly stepped on by the new data introduced in the intervening time step. This behavior is fundamental to the basic structure of the net, but it prevents it from handling the common case in human language, where tokens that are far apart in a sentence can still be deeply interrelated.
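To get a feel for how quickly that effect fades, here is a minimal sketch of a single-unit recurrence (assuming NumPy; the weights `w_h` and `w_x` and the toy input sequence are made up for illustration, not taken from any model in this book). Running the same sequence twice, once with a distinctive first token and once with it zeroed out, shows how the gap between the two hidden states shrinks at every step:

```python
import numpy as np

# Toy one-unit recurrent cell: h_t = tanh(w_h * h_{t-1} + w_x * x_t)
# The weights below are illustrative, not from a trained network.
w_h, w_x = 0.5, 1.0

def hidden_states(sequence):
    """Return the hidden state after each time step of the sequence."""
    h, states = 0.0, []
    for x in sequence:
        h = np.tanh(w_h * h + w_x * x)
        states.append(h)
    return states

with_token = hidden_states([1.0, 0.2, 0.3, 0.1, 0.4])      # first token present
without_token = hidden_states([0.0, 0.2, 0.3, 0.1, 0.4])   # first token zeroed out

# How much of the first token's effect survives at each later step?
for t, (a, b) in enumerate(zip(with_token, without_token)):
    print(f"step {t}: |difference| = {abs(a - b):.4f}")
```

The printed differences shrink roughly geometrically, so after only a couple of intervening tokens the two hidden states are nearly indistinguishable. The gated memory cell introduced in this chapter is designed to carry such information across many more time steps.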