RNN architectures

NLP has long been considered an excellent use case for LSTMs and other RNN-style neural architectures because they process input sequentially, and the meaning of a sentence is context dependent: the meaning of each word depends on all the words that came before it.

Now, to run an LSTM network over text, you first need to convert the words into vectors through an embedding layer. This layer is generally initialized with random weights, but you can often improve the model's performance by initializing it with vectors from a trained fastText model instead. Let's take a look.
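Here is a minimal sketch of the idea in Python, using gensim to load fastText vectors and PyTorch for the embedding layer and LSTM. The model path, vocabulary, and layer sizes are illustrative assumptions, not values from this book:

```python
# A minimal sketch: initializing an embedding layer with fastText
# vectors instead of random weights. Path, vocabulary, and sizes
# below are placeholder assumptions.
import numpy as np
import torch
import torch.nn as nn
from gensim.models.fasttext import load_facebook_model

# Load a fastText model trained with the Facebook fastText tool
# (replace with the path to your own .bin file).
ft = load_facebook_model("cc.en.300.bin")
dim = ft.wv.vector_size

# Example vocabulary; in practice this comes from your tokenizer.
vocab = ["the", "movie", "was", "great"]
weights = np.zeros((len(vocab), dim), dtype=np.float32)
for i, word in enumerate(vocab):
    # fastText composes vectors from subwords, so even rare or
    # out-of-vocabulary words get a meaningful embedding.
    weights[i] = ft.wv[word]

# Build the embedding layer from the fastText weights;
# freeze=False lets the vectors be fine-tuned during training.
embedding = nn.Embedding.from_pretrained(
    torch.from_numpy(weights), freeze=False
)

# The embedding layer then feeds the LSTM as usual.
lstm = nn.LSTM(input_size=dim, hidden_size=128, batch_first=True)
token_ids = torch.tensor([[0, 1, 2, 3]])  # "the movie was great"
output, (h_n, c_n) = lstm(embedding(token_ids))
```

Because the fastText vectors already encode distributional and subword information, the LSTM starts from a much better representation of the input than random initialization would give it.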
