Hands-On Transfer Learning with Python

Constructing a custom optimizer

The objective is to iteratively minimize the overall loss with the help of an optimization algorithm. In the paper by Gatys et al., optimization was performed using L-BFGS, a quasi-Newton algorithm that is popularly used for solving non-linear optimization problems and for parameter estimation. This method usually converges faster than standard gradient descent.
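
As a quick illustration of the optimizer interface we will rely on, the following minimal sketch (a toy quadratic objective, purely illustrative and not part of the style-transfer code) minimizes a simple function with SciPy's L-BFGS-B routine, passing the loss and the gradient as two separate callables over a flat one-dimensional vector:

    import numpy as np
    from scipy.optimize import fmin_l_bfgs_b

    # Toy objective: f(x) = sum((x - 3)^2), minimized at x = [3, 3, ..., 3].
    def loss(x):
        return np.sum((x - 3.0) ** 2)

    def grads(x):
        return 2.0 * (x - 3.0)

    x0 = np.zeros(5)  # the optimizer works only on flat 1-D vectors
    x_min, final_loss, info = fmin_l_bfgs_b(loss, x0, fprime=grads, maxiter=50)
    print(x_min, final_loss)  # x_min ends up very close to [3, 3, 3, 3, 3]

In neural style transfer, the loss and its gradients come out of the same network pass over a three-dimensional image, which is exactly the mismatch with this interface that the Evaluator class described next works around.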

SciPy provides an implementation in scipy.optimize.fmin_l_bfgs_b() (used in the toy sketch above); however, it has two limitations: it operates only on flat, one-dimensional vectors, unlike the three-dimensional image matrices we are dealing with, and it requires the loss value and the gradients to be passed as two separate functions. We build an Evaluator class, based on a pattern followed by Keras creator François Chollet, to compute...
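
A minimal sketch of this pattern is shown below; note that fetch_loss_and_grads, img_height, img_width, and initial_image are hypothetical stand-ins (in practice, fetch_loss_and_grads would be something like a keras.backend.function mapping the generated image to the loss and its gradients), included here only so the snippet runs end to end:

    import numpy as np
    from scipy.optimize import fmin_l_bfgs_b

    class Evaluator:
        def __init__(self, height, width, fetch_loss_and_grads):
            # fetch_loss_and_grads: stand-in for a callable that maps an image
            # batch to [loss_value, gradients_array] in a single pass.
            self.height = height
            self.width = width
            self.fetch_loss_and_grads = fetch_loss_and_grads
            self.loss_value = None
            self.grad_values = None

        def loss(self, x):
            # SciPy hands us a flat 1-D vector; reshape it into an image batch,
            # run one combined pass, and cache both the loss and the gradients.
            x = x.reshape((1, self.height, self.width, 3))
            loss_value, grad_values = self.fetch_loss_and_grads([x])
            self.loss_value = loss_value
            self.grad_values = np.asarray(grad_values).flatten().astype('float64')
            return self.loss_value

        def grads(self, x):
            # Return the gradients cached by the preceding loss() call.
            assert self.loss_value is not None
            grad_values = np.copy(self.grad_values)
            self.loss_value = None
            self.grad_values = None
            return grad_values

    # Dummy stand-ins so the sketch runs end to end; in the real workflow these
    # come from the style-transfer network and the preprocessed content image.
    def fetch_loss_and_grads(x_list):
        x = x_list[0]
        return [np.sum(x ** 2), 2.0 * x]

    img_height, img_width = 64, 64
    initial_image = np.random.uniform(0, 255, (1, img_height, img_width, 3))

    evaluator = Evaluator(img_height, img_width, fetch_loss_and_grads)
    x = initial_image.flatten().astype('float64')
    for i in range(10):
        # Loss and gradients are passed as two separate functions, as required.
        x, min_loss, info = fmin_l_bfgs_b(evaluator.loss, x,
                                          fprime=evaluator.grads, maxfun=20)

The key design choice is that loss() performs the single combined computation and caches the gradients, while grads() merely returns the cached values, so each pair of L-BFGS queries costs only one forward/backward pass.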
