PCA is a dimensionality reduction technique used to summarize a high-dimensional dataset with a smaller set of Principal Components (PCs), which explain most of the variability observed in the original data. The first PC is the direction along which the observations vary the most, or in other words, the linear combination of the variables in the dataset that maximizes the variance. Equivalently, the first PC minimizes the sum of the squared perpendicular distances between each observation and the line it defines. The second PC is again a linear combination of the original variables that captures the largest remaining variance, subject to the constraint that it is orthogonal to the first PC.
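As a quick illustration of the "direction of maximum variance" idea, here is a minimal NumPy sketch on simulated data (the matrix X, the rng seed, and the mixing matrix are all hypothetical, chosen only to create correlated variables). It computes the first PC as the leading eigenvector of the covariance matrix and checks that no random direction yields a larger projected variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 200 observations of 3 correlated variables (hypothetical example)
X = rng.normal(size=(200, 3)) @ np.array([[2.0, 0.5, 0.1],
                                          [0.0, 1.0, 0.3],
                                          [0.0, 0.0, 0.5]])
Xc = X - X.mean(axis=0)  # center each variable before computing PCs

# The first PC is the leading eigenvector of the covariance matrix
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues returned in ascending order
pc1 = eigvecs[:, -1]                    # direction of maximum variance

# Variance of the data projected onto the first PC ...
var_pc1 = np.var(Xc @ pc1, ddof=1)

# ... is at least as large as the variance along any other unit direction
for _ in range(5):
    d = rng.normal(size=3)
    d /= np.linalg.norm(d)
    assert var_pc1 + 1e-9 >= np.var(Xc @ d, ddof=1)

print(var_pc1, eigvals[-1])  # the two quantities coincide (up to rounding)
```

The projected variance along the first PC equals the largest eigenvalue of the covariance matrix, which is why the eigendecomposition gives us the component directly.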
In general, we can build as many PCs as there are variables in the dataset. Each PC is a linear combination of the variables, is orthogonal to the other components, and captures the largest remaining variance.
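A short sketch of this, using scikit-learn's PCA on another hypothetical dataset (the matrix X and its dimensions are assumptions for illustration): with as many components as variables, the component directions are mutually orthogonal unit vectors and the explained variances come out in decreasing order.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical dataset: 200 observations of 4 correlated variables
X = rng.normal(size=(200, 4)) @ rng.normal(size=(4, 4))

# Build as many PCs as there are variables (here, 4)
pca = PCA(n_components=X.shape[1]).fit(X)

# Rows of components_ are the PCs: unit-length and mutually orthogonal
assert np.allclose(pca.components_ @ pca.components_.T, np.eye(X.shape[1]))

# Each successive PC captures the largest remaining variance,
# so the explained variances are non-increasing
print(pca.explained_variance_)        # sorted in decreasing order
print(pca.explained_variance_ratio_)  # fractions of total variance, summing to 1
```

In practice we keep only the first few components, those whose cumulative explained variance ratio covers most of the variability in the data.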