Further reading
You can refer to the following papers for more insights:
Highway Networks at: https://arxiv.org/abs/1505.00387
Depth-Gated LSTM at: https://arxiv.org/abs/1508.03790
Learning Longer Memory in Recurrent Neural Networks at: https://arxiv.org/abs/1412.7753
Grid Long Short-Term Memory, Nal Kalchbrenner, Ivo Danihelka, Alex Graves, 2015
Recurrent Highway Networks, Julian Zilly, Rupesh Kumar Srivastava, Jan Koutník, Jürgen Schmidhuber, 2016
A Theoretically Grounded Application of Dropout in Recurrent Neural Networks, Yarin Gal, 2015
Recurrent Neural Network Regularization, Wojciech Zaremba, Ilya Sutskever, Oriol Vinyals, 2014
Using the Output Embedding to Improve Language Models, Ofir Press, Lior Wolf, 2016
Gated Feedback Recurrent Neural Networks, Junyoung Chung, Caglar Gulcehre, Kyunghyun Cho, Yoshua Bengio, 2015
A Clockwork RNN, Jan Koutník, Klaus Greff, Faustino Gomez, Jürgen Schmidhuber, 2014