TensorFlow Machine Learning Projects

You're reading from TensorFlow Machine Learning Projects: Build 13 real-world projects with advanced numerical computations using the Python ecosystem

Product type: Paperback
Published: Nov 2018
Publisher: Packt
ISBN-13: 9781789132212
Length: 322 pages
Edition: 1st Edition
Authors (2): Ankit Jain, Dr. Amita Kapoor
Table of Contents (17 chapters)

Preface
1. Overview of TensorFlow and Machine Learning
2. Using Machine Learning to Detect Exoplanets in Outer Space
3. Sentiment Analysis in Your Browser Using TensorFlow.js
4. Digit Classification Using TensorFlow Lite
5. Speech to Text and Topic Extraction Using NLP
6. Predicting Stock Prices using Gaussian Process Regression
7. Credit Card Fraud Detection using Autoencoders
8. Generating Uncertainty in Traffic Signs Classifier Using Bayesian Neural Networks
9. Generating Matching Shoe Bags from Shoe Images Using DiscoGANs
10. Classifying Clothing Images using Capsule Networks
11. Making Quality Product Recommendations Using TensorFlow
12. Object Detection at a Large Scale with TensorFlow
13. Generating Book Scripts Using LSTMs
14. Playing Pacman Using Deep Reinforcement Learning
15. What is Next?
16. Other Books You May Enjoy

Understanding categorical cross entropy loss

Cross entropy loss, or log loss, measures the performance of a classification model whose output is a probability between 0 and 1. Cross entropy increases as the predicted probability for a sample diverges from its actual label. Therefore, predicting a probability of 0.05 when the actual label is 1 results in a high cross entropy loss.

Mathematically, for a binary classification setting, cross entropy is defined by the following equation:

$$CE = -\left(y \log(p) + (1 - y) \log(1 - p)\right)$$

Here, $y$ is the binary indicator (0 or 1) denoting the true class of the sample, while $p$ denotes the predicted probability, between 0 and 1, that the sample belongs to class 1.
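To make these numbers concrete, here is a minimal NumPy sketch of the formula above (an illustration, not code from the book); the eps clipping constant is an assumption added to keep the logarithms finite:

```python
import numpy as np

def binary_cross_entropy(y, p, eps=1e-12):
    # y: true label (0 or 1); p: predicted probability of class 1
    p = np.clip(p, eps, 1 - eps)  # avoid log(0) at p = 0 or p = 1
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

print(binary_cross_entropy(1, 0.05))  # ~3.00: confident but wrong
print(binary_cross_entropy(1, 0.95))  # ~0.05: confident and right
```

Predicting 0.05 for a positive sample costs roughly 60 times more than predicting 0.95, which is exactly the behavior described above.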

Alternatively, if there are more than two classes, we define a new term known as categorical cross entropy. It is calculated as the sum of the separate losses for each class label per observation. Mathematically, for $M$ classes, it is given by the following equation:

$$CE = -\sum_{c=1}^{M} y_{o,c} \log(p_{o,c})$$

Here, $y_{o,c}$ is the binary indicator for whether class $c$ is the correct label of observation $o$, and $p_{o,c}$ is the predicted probability that observation $o$ belongs to class $c$.
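Again as a sketch rather than the book's code, the same quantity can be computed with NumPy over a small batch of one-hot labels; the labels and probabilities below are made-up values:

```python
import numpy as np

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    # y_true: one-hot labels, shape (n_samples, n_classes)
    # y_pred: predicted probabilities, shape (n_samples, n_classes)
    y_pred = np.clip(y_pred, eps, 1.0)                  # avoid log(0)
    per_sample = -np.sum(y_true * np.log(y_pred), axis=1)
    return np.mean(per_sample)                          # average over the batch

y_true = np.array([[0, 1, 0],
                   [1, 0, 0]])
y_pred = np.array([[0.1, 0.8, 0.1],
                   [0.6, 0.3, 0.1]])
print(categorical_cross_entropy(y_true, y_pred))  # ~0.37
```

In practice, TensorFlow's built-in tf.keras.losses.categorical_crossentropy computes the same per-sample quantity, so you rarely need to implement it by hand.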
