RNN-LM
Check out Andrej Karpathy’s excellent blog post on The Unreasonable Effectiveness of Recurrent Neural Networks, and the corresponding basic (vanilla) RNN implementation, which learns to predict text one character at a time. To generate new text, you can start with a random character and then repeatedly sample the next character from the model’s predicted probability distribution (rather than always taking the most likely one), which adds variety to each generated text.
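Below is a minimal sketch of that sampling loop, in the style of Karpathy’s min-char-rnn.py. The parameter names (`Wxh`, `Whh`, `Why`, `bh`, `by`) and sizes are illustrative assumptions; in practice they would come from a trained model, not random initialization.

```python
import numpy as np

# Hypothetical "trained" parameters, shaped as in min-char-rnn.py.
# Random values are stand-ins; a real model would load learned weights.
vocab_size, hidden_size = 65, 100
Wxh = np.random.randn(hidden_size, vocab_size) * 0.01  # input -> hidden
Whh = np.random.randn(hidden_size, hidden_size) * 0.01  # hidden -> hidden
Why = np.random.randn(vocab_size, hidden_size) * 0.01   # hidden -> output
bh = np.zeros((hidden_size, 1))
by = np.zeros((vocab_size, 1))

def sample(seed_ix, n):
    """Generate n character indices, starting from the index seed_ix."""
    x = np.zeros((vocab_size, 1))
    x[seed_ix] = 1                     # one-hot encoding of the seed char
    h = np.zeros((hidden_size, 1))     # initial hidden state
    ixes = []
    for _ in range(n):
        h = np.tanh(Wxh @ x + Whh @ h + bh)   # recurrent state update
        y = Why @ h + by                      # unnormalized scores
        p = np.exp(y) / np.sum(np.exp(y))     # softmax over next characters
        ix = np.random.choice(vocab_size, p=p.ravel())  # sample, not argmax
        x = np.zeros((vocab_size, 1))         # feed the sample back in
        x[ix] = 1
        ixes.append(ix)
    return ixes
```

Sampling (instead of always picking the argmax) is what keeps each generated text different from the last.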
As you can see, it loads its input data from a plain text file (line 8). Choose from a variety of datasets to train different models and sample some text:
- Goethe’s Faust
- Schiller’s Die Räuber
- Thesis Titles
- POTUS on Twitter
- …or your `bash` history?
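Whichever corpus you pick, the loading step stays the same as line 8 of min-char-rnn.py: read a plain text file and build the character vocabulary from it. A minimal sketch (`input.txt` is a placeholder path for whichever dataset you chose):

```python
# Read the corpus as one string; swap in any of the files above.
data = open("input.txt", encoding="utf-8").read()

# The vocabulary is simply the set of distinct characters in the corpus.
chars = sorted(set(data))
char_to_ix = {ch: i for i, ch in enumerate(chars)}
ix_to_char = {i: ch for i, ch in enumerate(chars)}

print(f"corpus has {len(data)} characters, {len(chars)} unique")
```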
RNN using PyTorch
Here are a number of related tutorials; a minimal model sketch follows the list:
- Sentiment Analysis
- Practical PyTorch’s RNN character generation
- Ben Trevett’s basic and advanced sentiment analysis
- Gabriel Loye’s LSTM Tutorial
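For orientation, here is a minimal sketch of the kind of character-level model these tutorials build: embed the characters, run them through a recurrent layer, and project back to a distribution over the vocabulary. The dimensions and the 65-character vocabulary are illustrative assumptions, and an LSTM is used here in place of a vanilla RNN (as in the LSTM tutorial above).

```python
import torch
import torch.nn as nn

class CharRNN(nn.Module):
    """Character-level language model: embed -> LSTM -> logits over chars."""
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x, hidden=None):
        emb = self.embed(x)                  # (batch, seq) -> (batch, seq, embed_dim)
        out, hidden = self.rnn(emb, hidden)  # (batch, seq, hidden_dim)
        return self.fc(out), hidden          # per-step logits over the vocabulary

# Training minimizes cross-entropy against the next character at each step.
model = CharRNN(vocab_size=65)                # 65 = assumed vocabulary size
x = torch.randint(0, 65, (8, 32))             # dummy batch: 8 sequences of 32 chars
logits, _ = model(x)
loss = nn.functional.cross_entropy(
    logits[:, :-1].reshape(-1, 65),           # predictions for positions 0..30
    x[:, 1:].reshape(-1),                     # targets shifted by one character
)
```

Generation then works exactly as in the sampling sketch above: feed a seed character, sample from the softmax of the logits, and feed the sample back in.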