Natural Language Processing in Action

by Cole Howard, Hobson Lane, Hannes Hapke
April 2019
Intermediate to advanced content level
544 pages
English
Manning Publications
Content preview from Natural Language Processing in Action

9 Improving retention with long short-term memory networks

This chapter covers

  • Adding deeper memory to recurrent neural nets
  • Gating information inside neural nets
  • Classifying and generating text
  • Modeling language patterns

For all the benefits recurrent neural nets provide for modeling relationships, and therefore possibly causal relationships, in sequence data, they suffer from one main deficiency: a token’s effect is almost completely lost by the time two tokens have passed.[1] Any effect the first node has on the third node (two time steps after the first time step) will be thoroughly stepped on by new data introduced in the intervening time step. This behavior is inherent to the basic structure of the net, but it prevents the common case in human ...
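
To make that decay concrete, here is a minimal numpy sketch (not code from the book; the dimensions, weights, and sequences are all invented for illustration). Two sequences differ only in their first token; because the untrained recurrent map here is deliberately scaled to be a contraction, the gap between their hidden states shrinks at every step, so after a few tokens the first token is nearly invisible to the network.

```python
import numpy as np

# Hypothetical sketch: how a plain tanh RNN "steps on" the first
# token's contribution. All sizes and weights are made up.
np.random.seed(0)
n_hidden, n_vocab = 4, 3
W = np.random.uniform(-1, 1, (n_hidden, n_hidden))
W *= 0.7 / np.linalg.norm(W, 2)           # scale so the recurrent map is a contraction
U = np.random.uniform(-1, 1, (n_hidden, n_vocab)) * 0.5  # input weights

def rnn_states(token_ids):
    """Return the hidden state after each time step of a simple tanh RNN."""
    h = np.zeros(n_hidden)
    states = []
    for tok in token_ids:
        x = np.eye(n_vocab)[tok]          # one-hot input vector
        h = np.tanh(W @ h + U @ x)
        states.append(h)
    return states

# Two sequences that differ ONLY in their first token
seq_a = [0, 1, 1, 1, 1]
seq_b = [2, 1, 1, 1, 1]

# The distance between the two hidden states shrinks at every step:
# the first token's effect is being overwritten by the new inputs.
for t, (ha, hb) in enumerate(zip(rnn_states(seq_a), rnn_states(seq_b))):
    print(f"step {t}: |h_a - h_b| = {np.linalg.norm(ha - hb):.4f}")
```

An LSTM counters exactly this effect: its gated cell state can carry the first token’s information forward largely unchanged, which is the subject of the rest of this chapter.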



Publisher Resources

ISBN: 9781617294631