Python Deep Learning - Second Edition
by Ivan Vasilev, Daniel Slater, Gianmario Spacagna, Peter Roelants, Valentino Zocca
January 2019
Intermediate to advanced
386 pages
11h 13m
English
Packt Publishing

Gated recurrent units

A gated recurrent unit (GRU) is a type of recurrent block introduced in 2014 by Kyunghyun Cho et al. (https://arxiv.org/abs/1406.1078, https://arxiv.org/abs/1412.3555) as an improvement over the LSTM (see the following diagram). A GRU usually achieves performance similar to or better than an LSTM's, but it does so with fewer parameters and operations:

A GRU cell

As in the "classic" RNN, a GRU cell has a single hidden state, ht. You can think of it as a combination of the hidden and cell states of an LSTM. The GRU cell has two gates:

  • An update gate zt, which is a combination of the input and forget LSTM gates. It decides ...
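The single hidden state and the update gate's interpolation can be sketched with NumPy, following the formulation in the Cho et al. paper linked above. The weight names (W_z, U_z, and so on) and the params dictionary are illustrative, not from the book:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x_t, h_prev, params):
    """One GRU step: x_t is the input vector, h_prev is h(t-1).

    params holds the weight matrices (hypothetical names):
    W_* map the input, U_* map the previous hidden state.
    """
    # Update gate z_t: how much of the candidate state to let through
    z_t = sigmoid(params["W_z"] @ x_t + params["U_z"] @ h_prev)
    # Reset gate r_t: how much of the previous state feeds the candidate
    r_t = sigmoid(params["W_r"] @ x_t + params["U_r"] @ h_prev)
    # Candidate hidden state, computed from the reset-scaled history
    h_cand = np.tanh(params["W_h"] @ x_t + params["U_h"] @ (r_t * h_prev))
    # New hidden state: interpolate between h_prev and the candidate
    return (1.0 - z_t) * h_prev + z_t * h_cand
```

Because z_t acts as a single interpolation weight, the GRU needs no separate cell state or output gate, which is where the parameter savings relative to an LSTM come from.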

Publisher Resources

ISBN: 9781789348460