RNNs

RNNs are popular primarily for NLP tasks (although they are also used in other scenarios, which we will talk about in Chapter 6, Recurrent Neural Networks). What's different about RNNs? Their peculiarity is that the connections between units form a directed graph along a sequence. This means that an RNN can exhibit dynamic temporal behavior for a given time sequence. Therefore, RNNs can use their internal state (memory) to process sequences of inputs, whereas in a traditional neural network we assume that all inputs and outputs are independent of each other. This makes RNNs suitable for cases such as predicting the next word in a sentence: it is definitely better to know which words came before it.
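To make the idea of internal state concrete, here is a minimal, library-agnostic sketch in NumPy of a single recurrent cell processing a sequence one timestep at a time. The weight names and shapes (W_xh, W_hh, b_h) and the random toy sequence are illustrative assumptions, not code from the book; the point is only that the new hidden state depends on both the current input and the previous state.

import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # The new hidden state depends on the current input AND the previous state
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 4, 8, 5

# Illustrative, randomly initialized parameters (in practice these are learned)
W_xh = rng.normal(scale=0.1, size=(input_size, hidden_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

h = np.zeros(hidden_size)                 # internal state (memory), initially empty
sequence = rng.normal(size=(seq_len, input_size))  # a toy input sequence

for x_t in sequence:                      # process the inputs one timestep at a time
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)

print(h.shape)  # (8,) -- the final state summarizes the whole sequence

Because the same weights are reused at every timestep and the state h is carried forward, earlier inputs can influence later outputs, which is exactly what a feed-forward network with independent inputs cannot do.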