Introduction to recurrent neural networks
Recurrent neural networks (RNNs) are the de facto architecture for sequential data processing. As the name indicates, an RNN recurs through the data, carrying information from each step forward to the next, and builds up a representation of the sequence's meaning, much as humans do when reading.
Although the vanilla RNN, in which a single RNN cell is unrolled once for each unit in the input, was a revolutionary idea, it failed to deliver production-ready results. The major hindrance was the long-term dependency problem: as the input sequence grows longer, the network can no longer retain information from the initial units (words, if the input is natural language) by the time it reaches the last cells. We'll see what an RNN cell contains and how it can be unrolled in the upcoming sections.
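To make the unrolling idea concrete, here is a minimal sketch of a vanilla tanh RNN cell applied once per time step, written in plain NumPy. The function and variable names (`rnn_forward`, `hidden_size`, and so on) are our own illustrative choices, not part of any particular library:

```python
import numpy as np

def rnn_forward(inputs, W_xh, W_hh, b_h):
    """Unroll one tanh RNN cell over a sequence of input vectors.

    inputs: list of shape-(input_size,) arrays, one per time step.
    Returns the hidden state after the final step.
    """
    h = np.zeros(W_hh.shape[0])           # initial hidden state
    for x in inputs:                      # same cell, applied at every step
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
    return h

# Toy dimensions for illustration only.
rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 4, 8, 5
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)
xs = [rng.normal(size=input_size) for _ in range(seq_len)]

h_final = rnn_forward(xs, W_xh, W_hh, b_h)
print(h_final.shape)  # one hidden vector summarizes the whole sequence
```

Note that the same weight matrices are reused at every step; only the hidden state `h` changes. This weight sharing is what "unrolling one cell" means, and the repeated multiplication by `W_hh` is also the root of the long-term dependency problem described above.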
Several iterations and years of research yielded a few different approaches to RNN architectural design. The state-of-the-art models now use long short-term memory (LSTM...