Srikanth Pagadala

Text Generation with LSTM Recurrent Neural Networks with Keras

16 Nov 2016

Recurrent neural networks can also be used as generative models.

This means that in addition to making predictions, they can learn the sequences of a problem and then generate entirely new, plausible sequences for the problem domain.

Generative models like this are useful not only for studying how well a model has learned a problem, but also for learning more about the problem domain itself.

Here you will discover how to create a generative model for text, character by character, using LSTM recurrent neural networks in Python with Keras.

By the end you will know:

  • Where to download a free corpus of text that you can use to train text generative models.
  • How to frame the problem of text sequences for a recurrent neural network generative model.
  • How to develop an LSTM to generate plausible text sequences for a given problem.
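To give a flavor of the second point, here is a minimal sketch of how a raw string can be framed as training patterns for a character-level generative model: each fixed-length window of characters becomes an input sequence, and the character that follows it becomes the target. The sample text and the window length `seq_length` are arbitrary placeholders, not values from the tutorial; in the full example the integer sequences would then be reshaped, normalized, and one-hot encoded before being fed to a Keras LSTM.

```python
# Hypothetical sample text standing in for the downloaded corpus.
text = "deep learning with keras makes sequence modeling approachable"

# Map each distinct character to an integer and back.
chars = sorted(set(text))
char_to_int = {c: i for i, c in enumerate(chars)}
int_to_char = {i: c for c, i in char_to_int.items()}

# Slide a fixed-length window over the text: each window of seq_length
# characters is an input pattern, and the next character is its target.
seq_length = 10  # arbitrary choice for illustration
X, y = [], []
for i in range(len(text) - seq_length):
    window = text[i:i + seq_length]
    target = text[i + seq_length]
    X.append([char_to_int[c] for c in window])
    y.append(char_to_int[target])

print(len(X), "training patterns")
print("first input:", "".join(int_to_char[i] for i in X[0]),
      "-> next char:", repr(int_to_char[y[0]]))
```

Each pattern overlaps the next by all but one character, so even a short corpus yields many training examples, which is what makes character-level modeling tractable.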

Source Code

Report

Next: Grid Search Hyperparameters for Deep Learning Models with Keras