RNN-LM

Check out Andrej Karpathy’s excellent blog post, “The Unreasonable Effectiveness of Recurrent Neural Networks”, and the corresponding basic (vanilla) RNN implementation, which learns to predict text one character at a time. To generate new text, start with a random character, then repeatedly sample from the predicted distribution over next characters rather than always taking the single most likely one (this adds some variety to each generated text).
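
To make that sampling step concrete, here is a minimal sketch in NumPy. The helper name and the temperature parameter are illustrative additions, not part of Karpathy’s script; the idea is simply to draw from the softmax distribution instead of taking the argmax.

```python
import numpy as np

def sample_next_char(probs, chars, temperature=1.0):
    """Sample the next character from the RNN's softmax output.

    probs: predicted probabilities over the vocabulary (sums to 1).
    chars: list mapping vocabulary indices back to characters.
    temperature < 1 makes sampling more conservative, > 1 more surprising.
    """
    logits = np.log(probs) / temperature
    p = np.exp(logits) / np.sum(np.exp(logits))   # re-normalized distribution
    return chars[np.random.choice(len(chars), p=p)]
```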

As you can see, it loads its input data from a plain text file (line 8). Choose from a variety of plain-text datasets to train different models and sample some text.
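
For reference, the top of Karpathy’s min-char-rnn.py looks roughly like this (paraphrased, not a verbatim copy; input.txt is the script’s expected filename):

```python
# Paraphrased from the top of min-char-rnn.py: read a plain-text
# corpus and build the character vocabulary from it.
data = open('input.txt', 'r').read()                 # plain text training data
chars = sorted(set(data))                            # unique characters = vocabulary
char_to_ix = {ch: i for i, ch in enumerate(chars)}   # character -> integer index
ix_to_char = {i: ch for i, ch in enumerate(chars)}   # integer index -> character
```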

Quick, Draw!

For a more challenging task using LSTMs, we’ll use the Quick, Draw! data.

The task is to classify a drawing of an object based on its pen strokes (i.e., a sequence of drawing inputs). For image-processing tasks (and, in fact, for speech too!), convolutional neural networks are a great choice for learning local structure, so they pair naturally with LSTMs here.
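
To make this concrete, here is a minimal Keras sketch of such a stroke classifier. This is not the official tutorial model: the layer sizes and the exact Conv1D-plus-LSTM stacking are assumptions for illustration, and each time step is assumed to encode a pen movement as (dx, dy, pen_lifted).

```python
import tensorflow as tf

NUM_CLASSES = 345  # the Quick, Draw! dataset covers 345 object categories

# A minimal sketch, not the official tutorial model: 1-D convolutions
# summarize local stroke structure, an LSTM reads the whole sequence,
# and a softmax picks the object category. Layer sizes are illustrative.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(None, 3)),  # variable-length sequence of (dx, dy, pen_lifted)
    tf.keras.layers.Conv1D(48, 5, padding='same', activation='relu'),
    tf.keras.layers.Conv1D(64, 5, padding='same', activation='relu'),
    tf.keras.layers.LSTM(128),        # final hidden state summarizes the drawing
    tf.keras.layers.Dense(NUM_CLASSES, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
```

The convolutions don’t shorten the sequence here (padding='same'); their job is to give the recurrent layer richer per-step features than raw pen offsets.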

You can work your way through the respective TensorFlow tutorial, or, if you prefer to work in Colab, follow this tutorial.