The major theme of this chapter was generating text automatically using RNNs. We started the chapter with a discussion of language models and their real-world applications. We then took an in-depth look at recurrent neural networks and their suitability for language modeling tasks, contrasting them with traditional feedforward networks to build a clearer understanding of how RNNs work. We went on to discuss the exploding and vanishing gradient problems that RNNs suffer from, along with their solutions. After establishing this theoretical foundation, we implemented a character-level language model with an RNN, using Alice's Adventures in Wonderland as the text corpus to train the model and then generating a string as output. Finally, we discussed some ideas for improving our character-level language model.
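
The core mechanism recapped above, one-hot character input, a recurrent hidden state, and sampling from a softmax over the vocabulary, can be sketched in a few lines of NumPy. This is an illustrative toy, not the chapter's actual implementation: the corpus string, hidden size, and weight initialization here are stand-in assumptions, and the weights are untrained, so the sampled text is near-random.

```python
import numpy as np

# Illustrative stand-ins, not the chapter's actual corpus or hyperparameters.
corpus = "alice was beginning to get very tired"
chars = sorted(set(corpus))
char_to_ix = {c: i for i, c in enumerate(chars)}
ix_to_char = {i: c for i, c in enumerate(chars)}
vocab_size, hidden_size = len(chars), 32

rng = np.random.default_rng(0)
Wxh = rng.normal(0, 0.01, (hidden_size, vocab_size))   # input -> hidden
Whh = rng.normal(0, 0.01, (hidden_size, hidden_size))  # hidden -> hidden (recurrence)
Why = rng.normal(0, 0.01, (vocab_size, hidden_size))   # hidden -> output
bh = np.zeros(hidden_size)
by = np.zeros(vocab_size)

def step(x_ix, h):
    """One RNN time step: consume a character index, return logits and new state."""
    x = np.zeros(vocab_size)
    x[x_ix] = 1.0                        # one-hot encode the input character
    h = np.tanh(Wxh @ x + Whh @ h + bh)  # new state mixes current input with history
    y = Why @ h + by                     # unnormalized scores over next characters
    return y, h

def sample(seed_char, n):
    """Generate n characters by repeatedly sampling from the softmax output."""
    h = np.zeros(hidden_size)
    ix = char_to_ix[seed_char]
    out = [seed_char]
    for _ in range(n):
        y, h = step(ix, h)
        p = np.exp(y - y.max())
        p /= p.sum()                     # softmax over the vocabulary
        ix = int(rng.choice(vocab_size, p=p))
        out.append(ix_to_char[ix])
    return "".join(out)

print(sample("a", 20))  # untrained weights, so the text is essentially random
```

Training would add backpropagation through time to update the three weight matrices; the recurrence `Whh @ h` is also where the vanishing and exploding gradient problems discussed in the chapter originate, since gradients are multiplied by `Whh` at every time step.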





















































