Using LSTMs to generate text
We have explored LSTMs in text classification. Now, we will look at how to generate the kind of text you would see in a novel, blog post, or children's storybook, ensuring that it is coherent and consistent with what we expect from these types of text. LSTMs prove useful here due to their ability to capture and remember intricate patterns across long sequences. When we train an LSTM on a large volume of text data, we allow it to learn the linguistic structure, style, and nuances of that text. It can then apply this knowledge to generate new sentences in line with the style and approach of the training set.
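As a rough illustration, the sketch below shows one common way to set up such a model in Keras as a next-word predictor: an embedding layer, an LSTM layer, and a softmax over the vocabulary. The values of `vocab_size`, the embedding size, and the LSTM width are placeholder assumptions, not prescriptions, and would come from tokenizing your own corpus and tuning.

```python
# A minimal sketch of a next-word prediction model (assumed hyperparameters).
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

vocab_size = 10000  # assumed vocabulary size after tokenizing the corpus

model = Sequential([
    # Map word indices to dense vectors
    Embedding(vocab_size, 100),
    # LSTM layer captures long-range patterns in the sequence
    LSTM(150),
    # Softmax over the vocabulary gives the probability of each next word
    Dense(vocab_size, activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
```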
Let’s imagine we are playing a word prediction game with our friends. The goal is to build a story in which each friend adds a word to continue it. To begin, we have a set of words, which we will call the seed, that sets the tone for our story. Starting from this seed sentence, each friend contributes the next word until we have a complete story. We can also apply this...
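The same word-by-word game can be played by the trained model itself. The sketch below, which assumes a trained `model`, a fitted Keras `tokenizer`, and a `max_sequence_len` from the training setup (all placeholder names), repeatedly predicts the most likely next word and appends it to the seed text.

```python
# A minimal sketch of the generation loop (greedy decoding), assuming
# `model`, `tokenizer`, and `max_sequence_len` exist from training.
import numpy as np
from tensorflow.keras.preprocessing.sequence import pad_sequences

def generate_text(seed_text, next_words, model, tokenizer, max_sequence_len):
    for _ in range(next_words):
        # Turn the current text into a padded sequence of word indices
        token_list = tokenizer.texts_to_sequences([seed_text])[0]
        token_list = pad_sequences([token_list],
                                   maxlen=max_sequence_len - 1,
                                   padding="pre")
        # Pick the most probable next word and append it to the text
        predicted_index = int(np.argmax(model.predict(token_list, verbose=0)))
        next_word = tokenizer.index_word.get(predicted_index, "")
        seed_text += " " + next_word
    return seed_text

# Example usage with a seed sentence
print(generate_text("once upon a time", 20, model, tokenizer, max_sequence_len))
```

Greedy decoding is only one choice; sampling from the predicted probabilities instead would give more varied stories from the same seed.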