Hands-On Computer Vision with TensorFlow 2
Leverage deep learning to create powerful image processing apps with TensorFlow 2.0 and Keras

Product type: Paperback
Published: May 2019
Publisher: Packt
ISBN-13: 9781788830645
Length: 372 pages
Edition: 1st
Authors (2): Eliot Andres, Benjamin Planche
Table of Contents (16 chapters)

Preface
1. Section 1: TensorFlow 2 and Deep Learning Applied to Computer Vision
2. Computer Vision and Neural Networks
3. TensorFlow Basics and Training a Model
4. Modern Neural Networks
5. Section 2: State-of-the-Art Solutions for Classic Recognition Problems
6. Influential Classification Tools
7. Object Detection Models
8. Enhancing and Segmenting Images
9. Section 3: Advanced Concepts and New Frontiers of Computer Vision
10. Training on Complex and Scarce Datasets
11. Video and Recurrent Neural Networks
12. Optimizing Models and Deploying on Mobile Devices
13. Migrating from TensorFlow 1 to TensorFlow 2
14. Assessments
15. Other Books You May Enjoy

Teaching our network to classify

So far, we have only implemented the feed-forward functionality of our network and its layers. Let's now update our FullyConnectedLayer class, adding methods for backpropagation and optimization:

class FullyConnectedLayer(object):
    # [...] (code unchanged)
    def __init__(self, num_inputs, layer_size, activation_fn, d_activation_fn):
        # [...] (code unchanged)
        self.d_activation_fn = d_activation_fn  # Derivative of the activation function
        self.x, self.y, self.dL_dW, self.dL_db = 0, 0, 0, 0  # Storage attributes

    def forward(self, x):
        z = np.dot(x, self.W) + self.b
        self.y = self.activation_fn(z)
        self.x = x  # we store these values for back-propagation
        return self.y

    def backward(self, dL_dy):
        """Back-propagate the loss."""
        dy_dz = self.d_activation_fn(self.y)  # = f'
        dL_dz = (dL_dy * dy_dz)  # dL/dz = dL/dy * dy/dz = l'_{k+1} * f'
        dz_dw...
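The listing is cut off mid-derivation. As a sketch of where it is heading (assuming the standard chain rule for a dense layer z = xW + b, not necessarily the book's exact code): dz/dW = x, dz/db = 1, and dz/dx = W, so the layer accumulates its weight and bias gradients and returns dL/dx to the previous layer. A self-contained version, with a simple gradient-descent step added as a hypothetical `optimize` method:

```python
import numpy as np

class FullyConnectedLayer(object):
    """Dense layer with forward and backward passes (illustrative sketch)."""

    def __init__(self, num_inputs, layer_size, activation_fn, d_activation_fn):
        self.W = np.random.standard_normal((num_inputs, layer_size))
        self.b = np.zeros(layer_size)
        self.activation_fn = activation_fn
        self.d_activation_fn = d_activation_fn  # Derivative of the activation function
        self.x, self.y, self.dL_dW, self.dL_db = 0, 0, 0, 0  # Storage attributes

    def forward(self, x):
        z = np.dot(x, self.W) + self.b
        self.y = self.activation_fn(z)
        self.x = x  # we store these values for back-propagation
        return self.y

    def backward(self, dL_dy):
        """Back-propagate the loss."""
        dy_dz = self.d_activation_fn(self.y)   # = f', expressed from the stored output
        dL_dz = dL_dy * dy_dz                  # dL/dz = dL/dy * dy/dz
        dz_dw = self.x.T                       # since z = x.W + b, dz/dW = x
        dz_dx = self.W.T                       # and dz/dx = W
        self.dL_dW = np.dot(dz_dw, dL_dz)      # gradient w.r.t. the weights
        self.dL_db = np.sum(dL_dz, axis=0)     # gradient w.r.t. the biases
        dL_dx = np.dot(dL_dz, dz_dx)           # loss gradient for the previous layer
        return dL_dx

    def optimize(self, epsilon):
        """Apply one gradient-descent step with learning rate epsilon."""
        self.W -= epsilon * self.dL_dW
        self.b -= epsilon * self.dL_db
```

Note that `d_activation_fn` receives the stored output `self.y`, not `z`; this works for activations such as the sigmoid, whose derivative can be written from its output as `y * (1 - y)`.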