Deep Learning with PyTorch

by Vishnu Subramanian
February 2018
Intermediate to advanced
262 pages
6h 59m
English
Packt Publishing

Backpropagation through time

The other important variable we see passed through the iterator is backpropagation through time (BPTT). It denotes the sequence length the model needs to remember: the higher the number, the better the model can capture long-range dependencies, but the model's complexity and the GPU memory it requires also increase.

To understand it better, let's look at how we can split the previous batched alphabet data into sequences of length two:

a g m s

b h n t

The chunk above is passed to the model as input, and the target is the same sequence shifted by one position, so that each element is the next value:

b h n t

c i o u
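The split above can be sketched in plain Python. This is an illustration, not the book's code: the `bptt_chunks` helper and the column-major layout are assumptions that mimic how torchtext's iterator carves batched data into BPTT-length input/target pairs shifted by one step:

```python
import string

# Hypothetical illustration: batch the first 24 letters into 4 columns
# of 6 rows each (column-major, the layout the batching step produces),
# then carve out bptt-length input/target chunks shifted by one row.
letters = list(string.ascii_lowercase[:24])

batch_size = 4
rows = len(letters) // batch_size  # 6 rows per column
# Row r of the batch holds the r-th letter of each of the 4 columns.
batched = [[letters[c * rows + r] for c in range(batch_size)]
           for r in range(rows)]

def bptt_chunks(data, bptt):
    """Yield (input, target) pairs; target is input shifted by one row."""
    for i in range(0, len(data) - 1, bptt):
        seq_len = min(bptt, len(data) - 1 - i)
        yield data[i:i + seq_len], data[i + 1:i + 1 + seq_len]

inp, tgt = next(bptt_chunks(batched, bptt=2))
# inp -> [['a', 'g', 'm', 's'], ['b', 'h', 'n', 't']]
# tgt -> [['b', 'h', 'n', 't'], ['c', 'i', 'o', 'u']]
```

Note that the target overlaps the input everywhere except its last row; the model learns to predict each next element from the elements seen so far.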

For the WikiText2 example, when we split the batched data, each batch has a size of 30 x 20, where 30 is the sequence length and 20 is the batch size. ...


ISBN: 9781788624336