Recurrent neural networks can also be used as generative models.
This means that in addition to being used as predictive models (making predictions), they can learn the sequences of a problem and then generate entirely new, plausible sequences for the problem domain.
Generative models like this are useful not only for studying how well a model has learned a problem, but also for learning more about the problem domain itself.
Here you will discover how to create a generative model for text, character-by-character using LSTM recurrent neural networks in Python with Keras.
By the end you will know how to frame a text corpus as a character-level sequence prediction problem and how to develop an LSTM in Keras that generates new, plausible text for that domain.
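To make the idea concrete before diving in, the sketch below shows the general shape of a character-level LSTM text generator in Keras: slide a window over the text, predict the next character, then sample repeatedly to generate new text. The tiny corpus, sequence length, and network size are illustrative placeholders, not the tutorial's actual dataset or settings.

```python
# A minimal sketch (not the tutorial's code) of framing text as a character-level
# sequence prediction problem and fitting a small LSTM with Keras.
# The corpus, sequence length, and network size are illustrative placeholders.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.utils import to_categorical

text = "hello world, hello keras. "  # placeholder corpus; use a real text file in practice
chars = sorted(set(text))
char_to_int = {c: i for i, c in enumerate(chars)}

# Slide a fixed-length window over the text: each window of characters is an input
# sequence and the character that follows it is the prediction target.
seq_length = 5
X, y = [], []
for i in range(len(text) - seq_length):
    window = text[i:i + seq_length]
    X.append([char_to_int[c] for c in window])
    y.append(char_to_int[text[i + seq_length]])

# Shape inputs as [samples, time steps, features] and one-hot encode the targets.
X = np.reshape(X, (len(X), seq_length, 1)) / float(len(chars))
y = to_categorical(y, num_classes=len(chars))

# A single LSTM layer followed by a softmax over the character vocabulary.
model = Sequential([
    Input(shape=(seq_length, 1)),
    LSTM(128),
    Dense(len(chars), activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")
model.fit(X, y, epochs=20, batch_size=8, verbose=0)

# Generate new text by repeatedly predicting the next character and sliding the window.
pattern = [char_to_int[c] for c in text[:seq_length]]
generated = ""
for _ in range(20):
    x = np.reshape(pattern, (1, seq_length, 1)) / float(len(chars))
    probs = model.predict(x, verbose=0)[0]
    next_index = int(np.argmax(probs))  # greedy choice; sampling gives more variety
    generated += chars[next_index]
    pattern = pattern[1:] + [next_index]
print(generated)
```

A larger corpus, longer windows, and more training are what turn this toy into the kind of plausible text generation described above.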