Srikanth Pagadala

Tune Multithreading Support for XGBoost

10 Aug 2016

The XGBoost library for gradient boosting is designed for efficient multi-core parallel processing.

This allows it to efficiently use all of the CPU cores in your system when training.

Here you will discover the parallel processing capabilities of the XGBoost library.

By the end you will know:

  • How to confirm that XGBoost multithreading support is working on your system.
  • How to evaluate the effect of increasing the number of threads on XGBoost.
  • How to get the most out of multithreaded XGBoost when using cross validation and grid search.

Source Code

Report

Next: Tune the Number and Size of Decision Trees with XGBoost