Hyperparameter optimization is a large part of applied deep learning. Neural networks are notoriously difficult to configure: there are many hyperparameters to set, and on top of that, individual models can be very slow to train.
In this tutorial, you will learn how to use the grid search capability of the scikit-learn Python machine learning library to tune the hyperparameters of Keras deep learning models.
By the end, you will know:
- How to wrap Keras models for use in scikit-learn, and how to use grid search (a minimal sketch follows this list)
- How to grid search common neural network hyperparameters, such as learning rate, dropout rate, epochs, and number of neurons
- How to define your own hyperparameter tuning experiments on your own projects
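As a minimal sketch of the core pattern, the example below wraps a Keras model so that scikit-learn's `GridSearchCV` can tune it. It assumes the SciKeras package (`pip install scikeras`) for the `KerasClassifier` wrapper, and random placeholder data with 8 input features stands in for a real dataset:

```python
# A minimal sketch, assuming the SciKeras package (pip install scikeras)
# provides the scikit-learn wrapper; placeholder data stands in for a real dataset.
import numpy as np
from scikeras.wrappers import KerasClassifier
from sklearn.model_selection import GridSearchCV
from tensorflow import keras

def create_model():
    # A deliberately small binary classifier; your model will depend on your data.
    model = keras.Sequential([
        keras.layers.Input(shape=(8,)),
        keras.layers.Dense(12, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
    return model

# Placeholder data: 100 samples, 8 features, binary labels.
X = np.random.rand(100, 8)
y = np.random.randint(0, 2, size=100)

# Wrap the model so GridSearchCV can treat it like any scikit-learn estimator.
model = KerasClassifier(model=create_model, verbose=0)

# Grid search batch size and epochs with 3-fold cross-validation.
param_grid = {"batch_size": [10, 20, 40], "epochs": [10, 50]}
grid = GridSearchCV(estimator=model, param_grid=param_grid, cv=3)
grid_result = grid.fit(X, y)
print("Best: %f using %s" % (grid_result.best_score_, grid_result.best_params_))
```

Because the wrapped model behaves like any scikit-learn estimator, everything that follows is mostly a matter of changing the keys and values in `param_grid`.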
Below is a list of the topics we are going to cover:
- How to use Keras models in scikit-learn
- How to use grid search in scikit-learn
- How to tune batch size and training epochs
- How to tune optimization algorithms
- How to tune learning rate and momentum
- How to tune network weight initialization
- How to tune activation functions
- How to tune dropout regularization
- How to tune the number of neurons in the hidden layer (sketched below)
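As a taste of the model-level tuning covered in the last topic, here is a sketch of searching over the number of hidden neurons. The `n_neurons` argument is a hypothetical name of my choosing; the pattern relies on SciKeras routing any grid parameter prefixed with `model__` to the build function:

```python
# A sketch of tuning the number of neurons, assuming SciKeras: parameters
# prefixed with "model__" are routed to the build function's arguments.
import numpy as np
from scikeras.wrappers import KerasClassifier
from sklearn.model_selection import GridSearchCV
from tensorflow import keras

def create_model(n_neurons=12):  # "n_neurons" is a hypothetical argument name
    model = keras.Sequential([
        keras.layers.Input(shape=(8,)),
        keras.layers.Dense(n_neurons, activation="relu"),
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
    return model

# Placeholder data: 100 samples, 8 features, binary labels.
X = np.random.rand(100, 8)
y = np.random.randint(0, 2, size=100)

model = KerasClassifier(model=create_model, epochs=20, batch_size=10, verbose=0)
param_grid = {"model__n_neurons": [5, 10, 15, 20, 25, 30]}
grid = GridSearchCV(estimator=model, param_grid=param_grid, cv=3)
grid_result = grid.fit(X, y)
print("Best: %f using %s" % (grid_result.best_score_, grid_result.best_params_))
```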