- Sequence to Sequence model (Cho et al. 2014): https://arxiv.org/pdf/1406.1078.pdf
- Understanding LSTM Networks (Christopher Olah's blog): http://colah.github.io/posts/2015-08-Understanding-LSTMs/
- Stanford University lecture on LSTM: https://www.youtube.com/watch?v=QuELiw8tbx8
- Sequence to Sequence Learning with Neural Networks (Sutskever et al. 2014): https://arxiv.org/pdf/1409.3215.pdf
- WildML article on Attention and Memory in Deep Learning and NLP: http://www.wildml.com/2016/01/attention-and-memory-in-deep-learning-and-nlp/
- OpenSubtitles: http://opus.nlpl.eu/OpenSubtitles.php
- tf.contrib.legacy_seq2seq.embedding_attention_seq2seq: https://www.tensorflow.org/api_docs/python/tf/contrib/legacy_seq2seq/embedding_attention_seq2seq
- tf.nn.sampled_softmax_loss (a usage sketch follows this list): https://www.tensorflow.org/api_docs/python/tf/nn/sampled_softmax_loss
- BLEU score (a small scoring example follows this list): https://www.youtube.com/watch?v=DejHQYAGb7Q
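
The last two TensorFlow links are the building blocks of the decoder's loss. As a rough illustration of how sampled softmax fits in, here is a minimal TensorFlow 1.x sketch; the variable names (`proj_w`, `proj_b`) and the sizes are placeholders chosen for this example, not the project's actual configuration:

```python
import numpy as np
import tensorflow as tf  # assumes TensorFlow 1.x, same as tf.contrib.legacy_seq2seq

vocab_size = 10000   # size of the decoder vocabulary (example value)
hidden_size = 256    # decoder cell size (example value)
num_sampled = 512    # negative classes sampled per step (example value)
batch_size = 64

# Output projection that maps decoder hidden states back to vocabulary logits.
proj_w = tf.get_variable("proj_w", [vocab_size, hidden_size])
proj_b = tf.get_variable("proj_b", [vocab_size])

decoder_outputs = tf.placeholder(tf.float32, [batch_size, hidden_size])
target_ids = tf.placeholder(tf.int32, [batch_size, 1])

# Sampled softmax compares each target word against only `num_sampled` sampled
# classes instead of the full vocabulary, which keeps training tractable.
loss = tf.reduce_mean(
    tf.nn.sampled_softmax_loss(
        weights=proj_w,
        biases=proj_b,
        labels=target_ids,
        inputs=decoder_outputs,
        num_sampled=num_sampled,
        num_classes=vocab_size))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(loss, feed_dict={
        decoder_outputs: np.random.randn(batch_size, hidden_size).astype(np.float32),
        target_ids: np.random.randint(0, vocab_size, (batch_size, 1)).astype(np.int32),
    }))
```

For BLEU, the video above explains the metric itself; a quick way to actually score a generated reply against a reference is NLTK's `sentence_bleu` (NLTK is not a project dependency and is used here only for illustration):

```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = [["i", "am", "doing", "well", "thank", "you"]]   # ground-truth reply (tokenized)
candidate = ["i", "am", "doing", "fine", "thank", "you"]     # model output (tokenized)

# Smoothing avoids a zero score when a higher-order n-gram has no match.
smooth = SmoothingFunction().method1
print("BLEU: %.3f" % sentence_bleu(reference, candidate, smoothing_function=smooth))
```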