Summary
In this chapter, we embarked on a voyage through the world of RNNs. We began by looking at the anatomy of RNNs and their variants, and we explored the task of classifying news articles using different model architectures. We then took a step further by incorporating pretrained word embeddings into our best-performing model in a quest to improve it, learning how to apply pretrained embeddings in our workflow along the way. For our final challenge, we took on the task of building a text generator that produces children’s stories.
In the next chapter, we will examine time series, explore their unique characteristics, and uncover various methods of building forecasting models. We will tackle a time-series problem, mastering how to prepare time-series data and how to train and evaluate forecasting models.