Chapter 6. Regression and Regularization
In the first chapter, we briefly introduced binary logistic regression (binomial logistic regression with a single variable) as our first test case, in order to illustrate the concept of discriminative classification. There are many more regression models, starting with the ubiquitous ordinary least-squares linear regression and the logistic regression [6:1].
The purpose of regression is to minimize a loss function, the residual sum of squares (RSS) being a commonly used choice. The problem of overfitting, described in the Overfitting section of Chapter 2, Hello World!, can be addressed by adding a penalty term to the loss function. The penalty term is one element of the broader concept of regularization.
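To make these two quantities concrete, the short sketch below computes the RSS of a linear model and a ridge-style variant that adds an L2 penalty on the weights. The object and method names (LossFunctions, rss, ridgeLoss) and the use of plain Scala arrays are assumptions made for this illustration only; they do not anticipate the classes implemented later in the chapter.

```scala
// A minimal sketch, assuming a linear model y ≈ bias + w.x over plain arrays.
// Names and signatures are illustrative, not the chapter's implementation.
object LossFunctions {

  // Residual sum of squares: sum of squared differences between the
  // observed values ys and the predictions bias + w.x
  def rss(weights: Array[Double], bias: Double,
          xs: Array[Array[Double]], ys: Array[Double]): Double =
    xs.zip(ys).map { case (x, y) =>
      val prediction = bias + x.zip(weights).map { case (xi, wi) => xi * wi }.sum
      val residual = y - prediction
      residual * residual
    }.sum

  // Ridge-style loss: RSS plus an L2 penalty lambda * ||w||^2 on the weights
  // (the bias term is conventionally left unpenalized).
  def ridgeLoss(weights: Array[Double], bias: Double,
                xs: Array[Array[Double]], ys: Array[Double],
                lambda: Double): Double =
    rss(weights, bias, xs, ys) + lambda * weights.map(w => w * w).sum
}
```

Setting lambda to zero recovers the plain RSS, while larger values shrink the weights toward zero; this trade-off is the essence of the regularization introduced in the second section.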
The first section of this chapter will describe and implement the linear least-squares regression. The second section will introduce the concept of regularization with an implementation of the Ridge regression.
Finally, the last section of the chapter revisits the logistic regression introduced in the first chapter, this time treated as a regression model in its own right.