Hands-On Machine Learning for Algorithmic Trading

by Stefan Jansen
December 2018
Beginner to intermediate
684 pages
21h 9m
English
Packt Publishing

Encoder-decoder architectures and the attention mechanism

The architectures discussed so far assumed that the input and output sequences have equal length. Encoder-decoder architectures, also called sequence-to-sequence (seq2seq) architectures, relax this assumption and have become very popular for machine translation and seq2seq prediction in general.

The encoder is an RNN that maps the input sequence into a different space, also called the latent space; the decoder is a complementary RNN that maps the encoded input to the target space. In the next chapter, we will cover autoencoders, which are able to learn a feature representation in an unsupervised setting using a variety of deep learning architectures.
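To make the encoder/decoder split concrete, here is a minimal forward-pass sketch in plain NumPy (not the book's implementation; all weight names and sizes are illustrative). The encoder compresses a variable-length input sequence into a fixed-size latent state, and the decoder unrolls from that state for as many steps as the target sequence requires, so input and output lengths can differ:

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_step(x, h, Wx, Wh, b):
    """One vanilla RNN step: next hidden state from input x and previous state h."""
    return np.tanh(x @ Wx + h @ Wh + b)

def encode(inputs, Wx, Wh, b):
    """Run the encoder over a variable-length input sequence;
    the final hidden state is the fixed-size latent representation."""
    h = np.zeros(Wh.shape[0])
    for x in inputs:
        h = rnn_step(x, h, Wx, Wh, b)
    return h

def decode(latent, steps, Wx, Wh, b, Wo):
    """Unroll the decoder from the latent state for `steps` outputs,
    feeding each prediction back in as the next input."""
    h = latent
    x = np.zeros(Wx.shape[0])
    outputs = []
    for _ in range(steps):
        h = rnn_step(x, h, Wx, Wh, b)
        y = h @ Wo          # project hidden state into the target space
        outputs.append(y)
        x = y               # autoregressive feedback
    return np.stack(outputs)

# Hypothetical dimensions: 3-dim inputs/outputs, 8-dim hidden state.
d_in, d_hid, d_out = 3, 8, 3
enc_Wx = 0.1 * rng.normal(size=(d_in, d_hid))
enc_Wh = 0.1 * rng.normal(size=(d_hid, d_hid))
enc_b  = np.zeros(d_hid)
dec_Wx = 0.1 * rng.normal(size=(d_out, d_hid))
dec_Wh = 0.1 * rng.normal(size=(d_hid, d_hid))
dec_b  = np.zeros(d_hid)
dec_Wo = 0.1 * rng.normal(size=(d_hid, d_out))

src = rng.normal(size=(5, d_in))                 # input sequence of length 5
latent = encode(src, enc_Wx, enc_Wh, enc_b)      # fixed-size summary
out = decode(latent, 7, dec_Wx, dec_Wh, dec_b, dec_Wo)
print(out.shape)  # (7, 3): output length need not match input length
```

In a trained model the weights would of course be learned end to end (and modern variants add attention so the decoder can look back at every encoder state, not just the last one), but the shape of the computation is the same.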

Encoder-decoder ...


ISBN: 9781789346411