Language modeling is an essential task for many applications, such as speech recognition and machine translation. In this section, we'll mimic the training process of RNNs to get a deeper understanding of how these networks work. We'll build a language model that operates over characters: we feed the network a chunk of text and train it to produce a probability distribution over the next character given the previous ones, which allows us to generate text similar to the text we fed in during training.
For example, suppose we have a language whose entire vocabulary consists of four letters: h, e, l, o.
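As a minimal sketch of how such training data could be prepared (assuming the four-letter vocabulary above and the word "hello" as the training text), each character is mapped to an index and then to a one-hot vector; the target at each step is simply the next character:

```python
# Hypothetical preprocessing for a character-level language model,
# assuming the 4-letter vocabulary "h", "e", "l", "o" from the example.
vocab = ['h', 'e', 'l', 'o']
char_to_ix = {ch: i for i, ch in enumerate(vocab)}

def one_hot(ch):
    """Return the one-hot encoding of a single character."""
    vec = [0] * len(vocab)
    vec[char_to_ix[ch]] = 1
    return vec

text = "hello"
# Inputs are every character except the last; the target at each
# time step is the character that follows it.
inputs = [one_hot(ch) for ch in text[:-1]]      # h, e, l, l
targets = [char_to_ix[ch] for ch in text[1:]]   # e, l, l, o

print(targets)  # -> [1, 2, 2, 3]
```

The network is then trained so that, given the one-hot input at each step, it assigns high probability to the corresponding target index.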
The task is to train a recurrent neural network on a specific input ...