
Benchmarking and Optimization of Gradient Boosted Decision Tree Algorithms

Andreea Anghel, Nikolaos Papandreou, Thomas Parnell, Alessandro De Palma, Haralampos Pozidis
IBM Research – Zurich, Rüschlikon, Switzerland
arXiv:1809.04559 [cs.LG], 12 Sep 2018

@article{anghel2018benchmarking,
   title={Benchmarking and Optimization of Gradient Boosted Decision Tree Algorithms},
   author={Anghel, Andreea and Papandreou, Nikolaos and Parnell, Thomas and De Palma, Alessandro and Pozidis, Haralampos},
   year={2018},
   month={sep},
   eprint={1809.04559},
   archivePrefix={arXiv},
   primaryClass={cs.LG}
}



Gradient boosted decision trees (GBDTs) have seen widespread adoption in academia, industry and competitive data science due to their state-of-the-art performance in a wide variety of machine learning tasks. In this paper, we present an extensive empirical comparison of XGBoost, LightGBM and CatBoost, three popular GBDT algorithms, to aid the data science practitioner in choosing from the multitude of available implementations. Specifically, we evaluate their behavior on four large-scale datasets with varying shapes, sparsities and learning tasks, assessing the algorithms’ generalization performance, training times (on both CPU and GPU) and sensitivity to hyper-parameter tuning. In our analysis, we first make use of a distributed grid-search to benchmark the algorithms on fixed configurations, and then employ a state-of-the-art algorithm for Bayesian hyper-parameter optimization to fine-tune the models.
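For readers who want to run a comparison of this kind themselves, the sketch below (not taken from the paper) trains the three libraries on a synthetic binary-classification task with roughly equivalent hyper-parameters. The dataset, metric and parameter values are illustrative assumptions, not the paper's experimental setup.

    # Minimal sketch: fit XGBoost, LightGBM and CatBoost on the same data
    # with comparable settings. All values here are illustrative assumptions.
    import xgboost as xgb
    import lightgbm as lgb
    import catboost as cb
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    # Synthetic stand-in for the paper's large-scale datasets.
    X, y = make_classification(n_samples=10_000, n_features=50, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    models = {
        "XGBoost": xgb.XGBClassifier(
            n_estimators=200, max_depth=6, learning_rate=0.1,
            tree_method="hist", random_state=0),
        "LightGBM": lgb.LGBMClassifier(
            n_estimators=200, num_leaves=63, learning_rate=0.1,
            random_state=0),
        "CatBoost": cb.CatBoostClassifier(
            iterations=200, depth=6, learning_rate=0.1,
            verbose=False, random_state=0),
    }

    for name, model in models.items():
        model.fit(X_train, y_train)
        auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
        print(f"{name}: test AUC = {auc:.4f}")

GPU training, which the paper also benchmarks, can be enabled through each library's device options (for example XGBoost's tree_method="gpu_hist", LightGBM's device="gpu" and CatBoost's task_type="GPU"), assuming the GPU-enabled builds are installed.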
