A powerful and popular type of recurrent neural network is the Long Short-Term Memory network, or LSTM.
It is widely used because its architecture overcomes the vanishing gradient problem that plagues standard recurrent neural networks, allowing very large and very deep networks to be created.
Like other recurrent neural networks, LSTM networks maintain state, and the specifics of how this is implemented in the Keras framework can be confusing.
Here you will discover exactly how state is maintained in LSTM networks by the Keras deep learning library.
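Before looking at Keras specifically, it helps to see what "state" means for an LSTM at all. The following is a minimal, illustrative pure-Python sketch of a single LSTM unit (not Keras code; the class name, scalar toy weights, and values are hypothetical). It shows that the hidden state and cell state persist across calls, so the same input produces different outputs until the state is explicitly reset:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyLSTMCell:
    """Illustrative single-unit LSTM cell with scalar toy weights (hypothetical)."""

    def __init__(self):
        # Fixed toy weights for demonstration; a real layer learns these.
        self.wf, self.wi, self.wc, self.wo = 0.5, 0.5, 0.5, 0.5
        self.uf, self.ui, self.uc, self.uo = 0.1, 0.1, 0.1, 0.1
        self.reset_states()

    def reset_states(self):
        # Hidden state h and cell state c persist between calls until reset.
        self.h, self.c = 0.0, 0.0

    def step(self, x):
        f = sigmoid(self.wf * x + self.uf * self.h)    # forget gate
        i = sigmoid(self.wi * x + self.ui * self.h)    # input gate
        g = math.tanh(self.wc * x + self.uc * self.h)  # candidate cell state
        o = sigmoid(self.wo * x + self.uo * self.h)    # output gate
        self.c = f * self.c + i * g                    # update cell state
        self.h = o * math.tanh(self.c)                 # update hidden state
        return self.h

cell = TinyLSTMCell()
first = cell.step(1.0)
second = cell.step(1.0)       # same input, different output: state was carried over
cell.reset_states()
after_reset = cell.step(1.0)  # matches the first call again
```

Keras manages exactly this kind of internal state for you; the key question, covered here, is when the library resets it.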
By the end you will know: