A benefit of ensemble methods built on decision trees, such as gradient boosting, is that a trained model can automatically provide estimates of feature importance.
In this post you will discover how to estimate the importance of features for a predictive modeling problem using the XGBoost library in Python.
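As a quick illustration of the idea, the sketch below fits a gradient boosted tree ensemble and reads off one importance score per input feature. It uses scikit-learn's GradientBoostingClassifier on synthetic data as a stand-in; XGBoost's XGBClassifier exposes the same scores through the same feature_importances_ attribute. The dataset sizes and hyperparameters here are arbitrary choices for the example, not values from this post.

```python
# Sketch: tree ensembles expose a per-feature importance score after training.
# scikit-learn's GradientBoostingClassifier is used here as a stand-in;
# xgboost.XGBClassifier exposes the same scores via feature_importances_.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic binary classification problem with 5 input features.
X, y = make_classification(n_samples=200, n_features=5, random_state=7)

model = GradientBoostingClassifier(n_estimators=50, random_state=7)
model.fit(X, y)

# One score per input feature; scikit-learn normalizes them to sum to 1.
for i, score in enumerate(model.feature_importances_):
    print(f"feature {i}: {score:.3f}")
```

Higher scores indicate features the ensemble's trees split on more (and more usefully); the scores can be used to interpret the model or to select a subset of features.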
By the end you will know: