June 2019
Intermediate to advanced
372 pages
6h 34m
English
By the end of this chapter, you will be able to identify the drawbacks of the standard RNN architecture and understand how this chapter's techniques address them.
In previous chapters, we studied text processing techniques such as tokenization, word embeddings, and Term Frequency-Inverse Document Frequency (TF-IDF). We also learned about a specific network architecture, the Recurrent Neural Network (RNN), which suffers from the vanishing gradient problem.
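To make the vanishing gradient problem concrete, here is a minimal numerical sketch (illustrative only, not code from the book): when backpropagating through a vanilla RNN, the gradient is multiplied at every time step by a factor involving the recurrent weight and the tanh derivative. When that per-step factor is below 1, the gradient shrinks exponentially with sequence length. The weight value `0.5` and the tanh-derivative value `0.9` below are assumed for illustration.

```python
def gradient_magnitude(steps, w=0.5, tanh_grad=0.9):
    """Rough bound on the gradient reaching the first time step
    after backpropagating through `steps` RNN steps.

    Each step multiplies the gradient by |w| * tanh'(x); with a
    per-step factor below 1, the product decays exponentially.
    """
    grad = 1.0
    for _ in range(steps):
        grad *= w * tanh_grad  # per-step factor 0.45 < 1
    return grad

for t in (1, 10, 50):
    print(f"steps={t:>2}  gradient ~ {gradient_magnitude(t):.2e}")
```

After 50 steps the gradient is vanishingly small, which is why early time steps in a long sequence barely influence the weight updates of a vanilla RNN.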
In this chapter, we are going to study ...