RNN-LM

Check out Andrej Karpathy’s excellent blog post, “The Unreasonable Effectiveness of Recurrent Neural Networks”, and the corresponding basic (vanilla) RNN implementation, which learns to predict text one character at a time. To generate new text, start from a seed character and repeatedly sample the next character from the model’s predicted distribution; sampling, rather than always picking the single most likely character, adds variety to each generated text.
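The sampling step above can be sketched as follows. This is a hypothetical helper (not code from Karpathy’s script): it turns the model’s raw output scores (logits) into a probability distribution with a softmax and draws one character index from it. A temperature parameter, a common extension, controls how random the samples are.

```python
import numpy as np

def sample_next_char(logits, temperature=1.0, rng=None):
    """Sample the index of the next character from raw logits.

    temperature < 1 sharpens the distribution (closer to greedy);
    temperature > 1 flattens it (more random). Hypothetical helper,
    not taken from min-char-rnn.py.
    """
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature
    # Softmax with max-subtraction for numerical stability.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))
```

Generation then loops: feed the sampled character back into the RNN, get new logits, and sample again.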

As you can see, the script loads its input data from a plain text file (line 8). You can choose from a variety of datasets to train different models and sample some text:
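The data-loading step amounts to reading the text and building character/index lookup tables. Here is a self-contained sketch using an inline string; in the actual script the text comes from `open('input.txt', 'r').read()`:

```python
def build_vocab(text):
    """Build the character vocabulary and lookup tables for a corpus."""
    chars = sorted(set(text))  # unique characters, in a stable order
    char_to_ix = {ch: i for i, ch in enumerate(chars)}
    ix_to_char = {i: ch for ch, i in char_to_ix.items()}
    return chars, char_to_ix, ix_to_char

text = "hello world"  # stand-in for the contents of a training file
chars, char_to_ix, ix_to_char = build_vocab(text)

# Encode the text as a list of integer indices for training.
encoded = [char_to_ix[ch] for ch in text]
```

Swapping in a different text file changes the vocabulary and, after training, the style of the sampled output.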

RNN using PyTorch
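A minimal character-level RNN language model in PyTorch might look like the sketch below. This is an illustrative model, not the code from any particular tutorial: an embedding layer maps character indices to vectors, `nn.RNN` processes the sequence, and a linear head produces one logit per vocabulary character at every time step.

```python
import torch
import torch.nn as nn

class CharRNN(nn.Module):
    """Minimal character-level RNN language model (a sketch)."""
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, h=None):
        # x: (batch, seq_len) tensor of character indices
        out, h = self.rnn(self.embed(x), h)
        # logits: (batch, seq_len, vocab_size); h carries state forward
        return self.head(out), h

model = CharRNN(vocab_size=65)
x = torch.randint(0, 65, (2, 16))   # dummy batch of 2 sequences, length 16
logits, h = model(x)
```

Training would minimize cross-entropy between the logits at each step and the next character in the sequence; at sampling time, the returned hidden state `h` is fed back in so the model generates one character at a time.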

Here are a number of related tutorials: