Mastering Numerical Computing with NumPy: Master scientific computing and perform complex operations with ease

Product type: Paperback
Published: Jun 2018
Publisher: Packt
ISBN-13: 9781788993357
Length: 248 pages
Edition: 1st
Authors (3): Tiago Antao, Mert Cuhadaroglu, Umit Mert Cakmak
Table of Contents (11 chapters)

Preface
1. Working with NumPy Arrays
2. Linear Algebra with NumPy
3. Exploratory Data Analysis of Boston Housing Data with NumPy Statistics
4. Predicting Housing Prices Using Linear Regression
5. Clustering Clients of a Wholesale Distributor Using NumPy
6. NumPy, SciPy, Pandas, and Scikit-Learn
7. Advanced NumPy
8. Overview of High-Performance Numerical Computing Libraries
9. Performance Benchmarks
10. Other Books You May Enjoy

Computing gradient

For a function of a single variable, the derivative gives the slope of the line tangent to its graph. The gradient generalizes the derivative to functions of several variables: instead of a single scalar, it is a vector whose components are the partial derivatives with respect to each variable. The main goal of ML is to find the model that best fits your data, where "best" means minimizing a loss (or objective) function. The gradient is used to find the coefficient values that minimize this loss or cost function. A well-known way of finding optimum points is to take the derivative of the objective function, set it to zero, and solve for the model coefficients. If you have more than one coefficient, the derivative becomes a gradient and the scalar equation becomes a vector equation...
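As a minimal sketch of these ideas (illustrative code, not the book's own example), the snippet below uses NumPy to compute the gradient of a mean-squared-error loss for a linear model, checks one component against a finite-difference approximation, and then sets the gradient to zero to solve the resulting normal equations for the optimal coefficients. The names X, y, mse_loss, and mse_gradient are assumed here for illustration.

import numpy as np

# Illustrative sketch: mean-squared-error loss L(w) = (1/n) * ||X @ w - y||^2
# for a linear model, and its gradient with respect to the coefficients w.

def mse_loss(w, X, y):
    residual = X @ w - y
    return (residual ** 2).mean()

def mse_gradient(w, X, y):
    # Vector of partial derivatives: dL/dw = (2/n) * X^T (X w - y)
    n = X.shape[0]
    return (2.0 / n) * X.T @ (X @ w - y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)

w = np.zeros(3)
grad = mse_gradient(w, X, y)

# Sanity check: central finite difference for the first coefficient
eps = 1e-6
step = np.array([eps, 0.0, 0.0])
numeric = (mse_loss(w + step, X, y) - mse_loss(w - step, X, y)) / (2 * eps)
print(grad[0], numeric)  # the two estimates should agree closely

# Setting the gradient to zero gives the normal equations X^T X w = X^T y,
# which can be solved directly for the optimal coefficients:
w_opt = np.linalg.solve(X.T @ X, X.T @ y)
print(w_opt)  # should be close to true_w

Solving the normal equations directly works for small problems; the same gradient function is what an iterative method such as gradient descent would use when the system is too large to solve in closed form.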
