References
The following references were cited in this chapter:
- Breiman, Leo (2001). Random Forests. Machine Learning 45, 5–32: https://doi.org/10.1023/A:1010933404324.
- Chen, Tianqi and Guestrin, Carlos (2016). XGBoost: A Scalable Tree Boosting System. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD '16), Association for Computing Machinery, New York, NY, USA, 785–794: https://doi.org/10.1145/2939672.2939785.
- Ke, Guolin et al. (2017). LightGBM: A Highly Efficient Gradient Boosting Decision Tree. Advances in Neural Information Processing Systems (NIPS '17), 3149–3157: https://dl.acm.org/doi/pdf/10.5555/3294996.3295074.
- Prokhorenkova, Liudmila, Gusev, Gleb et al. (2018). CatBoost: Unbiased Boosting with Categorical Features. Proceedings of the 32nd International Conference on Neural Information Processing Systems (NIPS '18): https://dl.acm.org/doi/abs/10.5555/3327757.3327770.