Hands-On Transfer Learning with Python

Implement advanced deep learning and neural network models using TensorFlow and Keras
By Nitin Panwar, Raghav Bali, Tamoghna Ghosh, and Dipanjan Sarkar
Packt, August 2018 | ISBN-13 9781788831307 | 438 pages | 1st Edition

Training our image captioning deep learning model

Before we start training, since our model contains several complex components, we use a callback that reduces the learning rate whenever the monitored training loss plateaus across successive epochs. This is extremely helpful for adjusting the learning rate on the fly without stopping training:

from keras.callbacks import ReduceLROnPlateau

# Multiply the learning rate by 0.15 when the training loss shows no
# improvement for two consecutive epochs, down to a minimum of 5e-6
reduce_lr = ReduceLROnPlateau(monitor='loss', factor=0.15,
                              patience=2, min_lr=0.000005)
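
Since we also save the model at the 30- and 50-epoch marks, a checkpoint callback can sit alongside reduce_lr. The following is a minimal sketch using Keras's standard ModelCheckpoint; the file name is a placeholder rather than the one used in this chapter:

from keras.callbacks import ModelCheckpoint

# Illustrative checkpoint: save weights whenever the training loss improves
# (the file name below is a placeholder, not the book's own path)
checkpoint = ModelCheckpoint('image_captioning_weights.h5',
                             monitor='loss', save_best_only=True)

Both callbacks can then be passed together through the callbacks argument when we call fit_generator below.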

Let's train our model now! We trained it for around 30 to 50 epochs, saving the model at roughly the 30-epoch mark and again at 50 epochs:

BATCH_SIZE = 256
EPOCHS = 30

# Each caption of length cl contributes (cl - 1) next-word prediction steps,
# so the total number of training samples is the sum of these counts
cap_lens = [(cl-1) for cl in tc_tokens_length]
total_size = sum(cap_lens)

history = model.fit_generator(
  dataset_generator(processed_captions...
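
The fit_generator call is truncated in this excerpt. As a rough sketch of the general pattern only (the generator's full argument list is the chapter's own and is not reproduced here), the data generator, a steps_per_epoch value derived from total_size and BATCH_SIZE, and the callbacks would be wired together roughly as follows:

# Sketch only: dataset_generator, processed_captions, and model are defined
# earlier in the chapter; the generator call below uses a simplified
# placeholder argument list, not the book's exact one
steps = total_size // BATCH_SIZE      # batches per epoch

history = model.fit_generator(
    dataset_generator(processed_captions),   # placeholder arguments
    steps_per_epoch=steps,
    epochs=EPOCHS,
    callbacks=[reduce_lr, checkpoint],   # checkpoint is the illustrative callback sketched above
    verbose=1)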