Build unigram and bigram language models, implement Laplace smoothing and use the models to compute the perplexity of test corpora.
Keras implementations of three language models: a character-level RNN, a word-level RNN, and a Sentence VAE (Bowman et al., 2016).
TensorFlow implementation of a bidirectional RNN language model
Natural language processing & computer vision models optimized for AWS
Code for the upcoming TACL paper with Graham Neubig, "Neural Lattice Language Models".
A bare-bones NumPy implementation of "Multimodal Neural Language Models" (Kiros et al., ICML 2014)
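The first entry's pipeline (n-gram counting, Laplace smoothing, perplexity) can be sketched in a few lines of Python. This is an illustrative sketch, not the listed repo's actual code; the function names and the sentence-list input format are assumptions:

```python
import math
from collections import Counter

def train_bigram(sentences):
    # Count unigrams and bigrams over sentences padded with boundary tokens.
    unigrams, bigrams = Counter(), Counter()
    for s in sentences:
        toks = ["<s>"] + s + ["</s>"]
        unigrams.update(toks)
        bigrams.update(zip(toks, toks[1:]))
    return unigrams, bigrams

def laplace_prob(w_prev, w, unigrams, bigrams, V):
    # Add-one (Laplace) smoothed bigram probability P(w | w_prev).
    return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + V)

def perplexity(sentences, unigrams, bigrams):
    # exp of the average negative log-probability per token.
    V = len(unigrams)  # vocabulary size, including boundary tokens
    log_prob, n_tokens = 0.0, 0
    for s in sentences:
        toks = ["<s>"] + s + ["</s>"]
        for prev, w in zip(toks, toks[1:]):
            log_prob += math.log(laplace_prob(prev, w, unigrams, bigrams, V))
            n_tokens += 1
    return math.exp(-log_prob / n_tokens)
```

Because the smoothed probabilities form a proper distribution over the vocabulary, the perplexity of any test corpus is bounded below by 1 and above by the vocabulary size V.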