A Keras LSTM model trained on a small Wikipedia dataset
A TensorFlow RNN model trained on a Reddit comment dataset
Pre-trained word vectors are downloaded from the Stanford GloVe project
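A minimal sketch of loading GloVe-format vectors into an embedding matrix, assuming the standard GloVe text layout (one word followed by its float components per line); the file path, vocabulary, and dimension below are illustrative assumptions, not the repo's exact setup.

```python
import numpy as np

def load_glove(path, vocab, dim):
    """Parse a GloVe text file (word followed by `dim` floats per line)
    into an embedding matrix aligned with `vocab` (word -> row index).
    Words absent from the file keep a zero vector."""
    matrix = np.zeros((len(vocab), dim), dtype=np.float32)
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word = parts[0]
            if word in vocab:
                matrix[vocab[word]] = np.asarray(parts[1:], dtype=np.float32)
    return matrix

# Hypothetical usage: feed the matrix to a Keras Embedding layer as initial weights.
# embedding_matrix = load_glove("glove.6B.100d.txt", vocab, 100)
```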
The Keras character-level LSTM model produced interesting results such as the following after 35 epochs of training. The sentences start to make sense once the loss drops below 1.
the early 1990s . He received an average relationship . However , makinum metelent differed from earlier advantage of rifling . The regimental command post of the region of the presence of sites and successor little @-@ battery ' .
= = Early legal career = =
The game was used by August 1942 , established a transition of an allust corperatography for probably also light to her career in the close . Planning also increased their offices as well as Clara conclude with the Great Fire of An
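A minimal sketch of how such a character-level LSTM can be set up in Keras: slide a fixed-length window over the corpus, predict the next character, and train with a softmax over the character vocabulary. The toy corpus, window length, and layer sizes here are illustrative assumptions, not the configuration actually used.

```python
import numpy as np
from tensorflow import keras

# Placeholder corpus; the real model trained on a Wikipedia dump.
text = "the quick brown fox jumps over the lazy dog. " * 20
chars = sorted(set(text))
char_to_idx = {c: i for i, c in enumerate(chars)}
maxlen = 40  # characters fed to the LSTM per sample (assumed value)
step = 3     # stride of the sliding window

# Each sample is `maxlen` character indices; the target is the next character.
X, y = [], []
for i in range(0, len(text) - maxlen, step):
    X.append([char_to_idx[c] for c in text[i:i + maxlen]])
    y.append(char_to_idx[text[i + maxlen]])
X = np.array(X)
y = np.array(y)

model = keras.Sequential([
    keras.layers.Embedding(len(chars), 64),
    keras.layers.LSTM(128),
    keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
# model.fit(X, y, batch_size=128, epochs=35)
```

Text is then generated by repeatedly sampling a character from the softmax output and appending it to the window, which is where the loss-below-1 samples above come from.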
https://machinelearningmastery.com/tune-lstm-hyperparameters-keras-time-series-forecasting/
https://magenta.tensorflow.org/datasets/nsynth
https://datasets.maluuba.com/NewsQA/dl
https://web.stanford.edu/class/cs224n/reports/2762029.pdf
https://arxiv.org/pdf/1609.05284.pdf
https://web.stanford.edu/class/cs224n/reports/2761214.pdf
https://arxiv.org/pdf/1607.04423.pdf
http://yerevann.github.io/2017/08/25/challenges-of-reproducing-r-net-neural-network-using-keras/
SLING: https://arxiv.org/pdf/1710.07032.pdf
https://research.googleblog.com/2017/11/sling-natural-language-frame-semantic.html
YOLO: https://arxiv.org/pdf/1612.08242.pdf
https://arxiv.org/pdf/1411.4555.pdf
https://arxiv.org/pdf/1608.07249.pdf
http://www.deeplearningbook.org/
https://arxiv.org/pdf/1508.06615.pdf
https://richliao.github.io/supervised/classification/2016/12/26/textclassifier-RNN/
https://dumps.wikimedia.org/zhwiki/latest/
https://arxiv.org/pdf/1603.06155.pdf
https://www.kdnuggets.com/datasets/index.html
http://freeconnection.blogspot.ca/2016/04/conversational-datasets-for-train.html