Hands-On Machine Learning with C++
Build, train, and deploy end-to-end machine learning and deep learning pipelines

Product type: Paperback
Published: May 2020
Publisher: Packt
ISBN-13: 9781789955330
Length: 530 pages
Edition: 1st Edition

Author: Kirill Kolodiazhnyi
Table of Contents (19 chapters)

Preface
1. Section 1: Overview of Machine Learning
2. Introduction to Machine Learning with C++
3. Data Processing
4. Measuring Performance and Selecting Models
5. Section 2: Machine Learning Algorithms
6. Clustering
7. Anomaly Detection
8. Dimensionality Reduction
9. Classification
10. Recommender Systems
11. Ensemble Learning
12. Section 3: Advanced Examples
13. Neural Networks for Image Classification
14. Sentiment Analysis with Recurrent Neural Networks
15. Section 4: Production and Deployment Challenges
16. Exporting and Importing Models
17. Deploying Models on Mobile and Cloud Platforms
18. Other Books You May Enjoy

Exploring linear methods for dimension reduction

This section describes the most popular linear methods used for dimension reduction.

Principal component analysis

Principal component analysis (PCA) is one of the most intuitive and frequently used methods for reducing the dimensionality of data by projecting it onto an orthogonal subspace of features. In very general terms, it rests on the assumption that our observations form something like an ellipsoid in a subspace of the original space, and the new basis coincides with the axes of this ellipsoid. This assumption lets us discard strongly correlated features simultaneously, since the basis vectors of the space we project them...
