Summary
In this chapter, we covered the nuts and bolts of the linear regression model. We started by introducing the SLR model, which relates a single input variable to one target variable, and then extended it to the MLR model with two or more predictors. Both models can be assessed using R², or, preferably, the adjusted R² metric. Next, we discussed specific scenarios, such as working with categorical variables and interaction terms, handling nonlinear relationships via transformations, computing the closed-form solution, and dealing with multicollinearity and heteroskedasticity. Lastly, we introduced widely used regularization techniques, namely the ridge and lasso penalties, which are added to the loss function as a penalty term and yield a regularized model; in the case of lasso regression, the solution is also sparse.
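The contrast between the two penalties can be illustrated with a minimal sketch, assuming scikit-learn is available; the synthetic data and the alpha values below are purely illustrative:

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso

# Synthetic data: 3 informative predictors plus 7 pure-noise predictors
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
true_coef = np.array([3.0, -2.0, 1.5, 0, 0, 0, 0, 0, 0, 0], dtype=float)
y = X @ true_coef + rng.normal(scale=0.5, size=200)

# Both penalties shrink the coefficients toward zero, but only the
# lasso (L1) penalty can drive some of them exactly to zero,
# effectively selecting a sparse subset of predictors.
ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

print("ridge zero coefficients:", int(np.sum(ridge.coef_ == 0)))
print("lasso zero coefficients:", int(np.sum(lasso.coef_ == 0)))
```

With a suitably chosen alpha, the lasso fit zeroes out the coefficients of the noise predictors, while the ridge fit merely shrinks them without eliminating any.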
In the next chapter, we will cover another type of widely used linear model: the logistic regression model.