Just like the feedforward and CNN architectures we covered in the previous two chapters, RNNs can be designed in a variety of ways to best capture the functional relationship and dynamics between input and output data.
Beyond recurrent connections between the hidden states, there are several alternative designs, including recurrent connections from the output, bidirectional RNNs, and encoder-decoder architectures (you can refer to GitHub for additional background references that complement this brief summary: https://github.com/PacktPublishing/Hands-On-Machine-Learning-for-Algorithmic-Trading).
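To make two of these variants concrete, the following is a minimal sketch (not taken from the book) of a bidirectional RNN and an encoder-decoder model using Keras; the layer sizes, sequence lengths, and feature dimensions are arbitrary assumptions chosen for illustration:

    from tensorflow.keras.layers import LSTM, Bidirectional, Dense, Input
    from tensorflow.keras.models import Model, Sequential

    # Bidirectional RNN: reads the sequence forward and backward and
    # concatenates both hidden states, so each step sees past and future context.
    bi_rnn = Sequential([
        Bidirectional(LSTM(32), input_shape=(20, 8)),  # 20 steps, 8 features (assumed)
        Dense(1),                                      # e.g., a single regression target
    ])
    bi_rnn.compile(optimizer='adam', loss='mse')

    # Encoder-decoder: the encoder compresses the input sequence into its final
    # hidden and cell states, which initialize the decoder that emits the output sequence.
    encoder_inputs = Input(shape=(None, 8))
    _, state_h, state_c = LSTM(32, return_state=True)(encoder_inputs)

    decoder_inputs = Input(shape=(None, 8))
    decoder_outputs = LSTM(32, return_sequences=True)(
        decoder_inputs, initial_state=[state_h, state_c])
    seq2seq = Model([encoder_inputs, decoder_inputs], Dense(8)(decoder_outputs))
    seq2seq.compile(optimizer='adam', loss='mse')

The bidirectional design suits tasks where the full input sequence is available up front, whereas the encoder-decoder design decouples input and output sequence lengths, which is why it underpins sequence-to-sequence applications.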