R Deep Learning Essentials - Second Edition

by Mark Hodnett, Joshua F. Wiley
August 2018
Intermediate to advanced
378 pages
English
Packt Publishing

Long short-term memory model

LSTMs are designed to learn long-term dependencies. Like RNNs, they are chains of repeating modules, but each module contains four interacting neural network layers rather than one. An LSTM splits its state into two parts: a short-term (hidden) state and a long-term (cell) state. Gates control how memories are stored: the input gate controls which parts of the input are added to the long-term memory, the forget gate controls which parts of the long-term memory are discarded, and the output gate controls which parts of the long-term memory appear in the output. This is only a brief description of LSTMs; a good reference for more detail is http://colah.github.io/posts/2015-08-Understanding-LSTMs/.
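The gate mechanics described above can be sketched as a single LSTM time step. This is a minimal illustrative implementation in NumPy (not the book's code, which uses R); the function name `lstm_step`, the stacking order of the four weight blocks, and the dimension names are assumptions made for this sketch:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step.

    x      -- input vector at this time step
    h_prev -- previous short-term (hidden) state
    c_prev -- previous long-term (cell) state
    W, U, b hold the parameters of the four internal layers,
    stacked in the (assumed) order [forget; input; candidate; output].
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b      # pre-activations for all four layers at once
    f = sigmoid(z[0*n:1*n])         # forget gate: which parts of c_prev to discard
    i = sigmoid(z[1*n:2*n])         # input gate: which parts of the input to store
    g = np.tanh(z[2*n:3*n])         # candidate values to add to the cell state
    o = sigmoid(z[3*n:4*n])         # output gate: which parts of memory to expose
    c = f * c_prev + i * g          # new long-term (cell) state
    h = o * np.tanh(c)              # new short-term (hidden) state, also the output
    return h, c

# Example: one step with random weights (dimensions chosen arbitrarily)
rng = np.random.default_rng(0)
d, n = 3, 4                               # input size, state size
W = rng.normal(size=(4 * n, d))
U = rng.normal(size=(4 * n, n))
b = np.zeros(4 * n)
h, c = lstm_step(rng.normal(size=d), np.zeros(n), np.zeros(n), W, U, b)
```

Note how the forget and input gates act multiplicatively on the cell state: because `c` is updated additively rather than being rewritten each step, gradients can flow across many time steps, which is what lets LSTMs capture long-term dependencies.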

The code for ...



Publisher Resources

ISBN: 9781788992893