6 RNN and LSTM
Recurrent neural networks (RNNs) and long short-term memory (LSTM) networks [1] are deep learning algorithms that process sequences of numerical inputs, enabling tasks such as handwriting recognition without segmentation and speech recognition. These tasks are difficult to handle with a convolutional neural network (CNN) because processing long sequences is hard and training suffers from the vanishing gradient problem; LSTM and GRU were designed to overcome this. Consequently, researchers frequently use RNN and LSTM classifiers, often combined with convolution layers, to obtain an efficient classifier. The working of RNN and LSTM is shown below with examples.
6.1 Concept of RNN
In an RNN, the input is processed using hidden layers, and the output of the previous step is passed as an input to the next step. The network [2] contains a "memory" that stores information about the operations performed by each node over time while processing the input to generate the hidden-layer outputs. Like other neural networks, it uses weight and bias parameters for the nodes in the hidden layer. The activation functions of the nodes, which would otherwise be independent, are made dependent across all nodes; this conversion reduces complexity while allowing the network to remember the operations of the current and previous nodes in the hidden layer. The working of a simple RNN is shown in Figure 6.1.
In the above classifier, the time step is denoted as t, an input as X, and the hidden state as h. ...
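The recurrence described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the book's implementation: the weight names (W_xh, W_hh, b_h), the tanh activation, and all dimensions are assumptions chosen for clarity. It shows how the hidden state h at each time step t is computed from the current input X_t and the previous hidden state, which is the "memory" the text refers to.

```python
import numpy as np

# Illustrative sketch of a simple RNN forward pass.
# All names and sizes here are assumptions for the example.
rng = np.random.default_rng(0)

input_size, hidden_size, seq_len = 3, 4, 5

W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input-to-hidden weights
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden-to-hidden ("memory") weights
b_h = np.zeros(hidden_size)                                    # hidden bias

X = rng.normal(size=(seq_len, input_size))  # sequence of inputs X_1 .. X_t
h = np.zeros(hidden_size)                   # initial hidden state h_0

hidden_states = []
for x_t in X:
    # The hidden state at step t depends on both the current input
    # and the previous hidden state, so earlier steps of the
    # sequence influence the current output.
    h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
    hidden_states.append(h)

print(len(hidden_states), hidden_states[-1].shape)
```

Because the same weights W_xh and W_hh are reused at every time step, the network can handle sequences of any length with a fixed number of parameters.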