Srikanth Pagadala

Feature Importance and Feature Selection with XGBoost

08 Aug 2016

A benefit of using ensembles of decision tree methods like gradient boosting is that they can automatically provide estimates of feature importance from a trained predictive model.

In this post you will discover how to estimate the importance of features for a predictive modeling problem using the XGBoost library in Python.

By the end you will know:

  • How feature importance is calculated by the gradient boosting algorithm.
  • How to plot feature importance calculated by an XGBoost model in Python (see the first sketch below).
  • How to use feature importance calculated by XGBoost to perform feature selection (see the second sketch below).
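
Here is a minimal sketch of the first two points: fitting an XGBoost model, reading the per-feature importance scores, and plotting them with the library's built-in helper. The dataset (scikit-learn's breast cancer data) is an illustrative stand-in, not necessarily the one used in the original report.

```python
# A minimal sketch: train an XGBoost model, then inspect and plot feature importance.
# Assumes xgboost, scikit-learn, and matplotlib are installed; the dataset is illustrative.
from sklearn.datasets import load_breast_cancer
from xgboost import XGBClassifier, plot_importance
import matplotlib.pyplot as plt

X, y = load_breast_cancer(return_X_y=True)

# Fit a gradient boosted tree ensemble
model = XGBClassifier()
model.fit(X, y)

# One score per input feature, aggregated over all trees in the ensemble:
# features used in more (and more effective) splits score higher
print(model.feature_importances_)

# Built-in bar chart of importances, ranked from most to least important
plot_importance(model)
plt.show()
```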

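The second sketch covers feature selection: each distinct importance value is tried as a cutoff via scikit-learn's SelectFromModel, the model is retrained on the surviving features, and test accuracy is reported so you can trade feature count against performance. The train/test split parameters here are illustrative choices.

```python
# A sketch of threshold-based feature selection using XGBoost importances
# together with scikit-learn's SelectFromModel; dataset and split are illustrative.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectFromModel
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=7)

# Fit once on the full feature set to obtain importance scores
model = XGBClassifier()
model.fit(X_train, y_train)

# Try each distinct importance value as a cutoff: keep only features whose
# importance meets the threshold, retrain, and score on the reduced set
for thresh in np.unique(model.feature_importances_):
    selection = SelectFromModel(model, threshold=thresh, prefit=True)
    X_train_sel = selection.transform(X_train)

    sel_model = XGBClassifier()
    sel_model.fit(X_train_sel, y_train)

    X_test_sel = selection.transform(X_test)
    acc = accuracy_score(y_test, sel_model.predict(X_test_sel))
    print("thresh=%.4f, n=%d, accuracy: %.2f%%"
          % (thresh, X_train_sel.shape[1], acc * 100.0))
```

A typical outcome is that accuracy degrades only slightly as low-importance features are dropped, so the threshold can be chosen to balance model simplicity against predictive performance.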
Source Code

Report