Srikanth Pagadala

Tune the Number and Size of Decision Trees with XGBoost

11 Aug 2016

Gradient boosting creates and adds decision trees sequentially, each new tree attempting to correct the mistakes of the learners that came before it.

This raises the question of how many trees (weak learners or estimators) to configure in your gradient boosting model and how big each tree should be.
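In the XGBoost scikit-learn wrapper these two knobs are the n_estimators and max_depth parameters, as in this minimal sketch:

```python
from xgboost import XGBClassifier

# 150 trees, each grown at most 3 levels deep
model = XGBClassifier(n_estimators=150, max_depth=3)
```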

Here you will discover how to design a systematic experiment to select the number and size of decision trees to use on your problem.

By the end you will know:

  • How to evaluate the effect of adding more decision trees to your XGBoost model.
  • How to evaluate the effect of creating larger decision trees in your XGBoost model.
  • How to investigate the relationship between the number and depth of trees on your problem (a combined grid search, sketched below).
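One way to run that combined experiment is a grid search with cross-validation over both parameters at once. Below is a minimal sketch using scikit-learn's GridSearchCV with the XGBoost scikit-learn wrapper; the make_classification dataset and the candidate value grids are assumptions for illustration, so substitute your own data and ranges.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from xgboost import XGBClassifier

# Placeholder data standing in for your problem's dataset
X, y = make_classification(n_samples=1000, n_features=20, random_state=7)

# Candidate values for the number of trees and the depth of each tree
param_grid = {
    "n_estimators": [50, 100, 150, 200],
    "max_depth": [2, 4, 6, 8],
}

# 10-fold cross-validated grid search over all 16 combinations
kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=7)
grid = GridSearchCV(XGBClassifier(), param_grid,
                    scoring="neg_log_loss", cv=kfold)
result = grid.fit(X, y)

print("Best: %f using %s" % (result.best_score_, result.best_params_))
for mean, params in zip(result.cv_results_["mean_test_score"],
                        result.cv_results_["params"]):
    print("%f with: %r" % (mean, params))
```

Plotting the mean scores against each parameter, or as a heat map over both, makes the interaction between tree count and tree depth easy to see.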

Source Code

Report

Next: Tune Learning Rate for Gradient Boosting with XGBoost