Summary
In this chapter, you explored when to use ensemble methods rather than linear regression or CART by building models with several algorithms on the housing value dataset, comparing XGBoost against linear regression, CART, gradient boosting, and random forests.
You learned when each of these models is a good choice and when deep learning is the better option. You ended the chapter by looking at the settings that control the learning process for the XGBoost algorithm.
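As a quick reminder of how those settings are used in practice, here is a minimal sketch, not the chapter's own code, that assumes the scikit-learn-style `XGBRegressor` wrapper and uses the California housing data as a stand-in for the chapter's dataset:

```python
# Minimal sketch: passing a few XGBoost learning settings via the
# scikit-learn wrapper. Dataset and parameter values are illustrative.
from sklearn.datasets import fetch_california_housing
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

model = XGBRegressor(
    n_estimators=200,   # number of boosting rounds (trees)
    learning_rate=0.1,  # shrinkage applied to each tree's contribution
    max_depth=4,        # maximum depth of each tree
    subsample=0.8,      # fraction of rows sampled for each tree
)
model.fit(X_train, y_train)
print(model.score(X_test, y_test))  # R^2 on the held-out split
```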
Now that you have a broad understanding of XGBoost, you are ready to tackle practical problems with data. In the next chapter, you will learn methods for cleaning data, how best to handle imbalanced data when building a classifier, and how to deal with other data-related problems.