Working with multiple layers

Now that we have covered multiple operations, we will look at how to connect layers, including custom ones, so that data propagates through them. The data we will generate and use will be representative of small random images. It is best to understand this type of operation with a simple example, seeing how we can use some built-in layers to perform calculations. The first layer we will explore is a moving window: we will compute a small moving-window average across a 2D image, and then a second, custom operation layer will transform the result.

Moving windows are widely useful for time-series data. Though there are layers specialized for sequences, a moving window may also prove useful when you are analyzing, for instance, MRI scans (neuroimages) or sound spectrograms.

Moreover, we will see that the computational graph can get large and hard to look at. To address this, we will also introduce ways to name operations and create scopes for layers.

Getting ready

To start, you have to load the usual packages – NumPy and TensorFlow – using the following:

import tensorflow as tf
import numpy as np

Let's now move on to the recipe; this time, things get more complex and more interesting.

How to do it...

We proceed with the recipe as follows.

First, we create our sample 2D image. It will be a 4 x 4 pixel image that we create with four dimensions; the first and last dimensions have a size of 1 (we keep the batch dimension distinct, so you can experiment with changing its size). Note that some TensorFlow image functions operate on four-dimensional tensors. Those four dimensions are image number (batch), height, width, and channel; to make this work with one channel, we explicitly set the last dimension to 1, as follows:

batch_size = [1]
x_shape = [4, 4, 1]
x_data = tf.random.uniform(shape=batch_size + x_shape)
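
As a quick sanity check (assuming TF 2.x eager execution, the default), you can print the tensor's shape to confirm the four dimensions:

print(x_data.shape)  # (1, 4, 4, 1): batch, height, width, channel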

To create a moving-window average across our 4 x 4 image, we will use a built-in function that convolves a constant filter across a 2 x 2 window. The function we will use is conv2d(), which is quite commonly used in image processing and in TensorFlow.

This function computes the element-wise product of the filter with each window of the input and sums the results. We must also specify a stride for the moving window in each direction. Here, we will compute four moving-window averages: one each for the upper-left, upper-right, lower-left, and lower-right 2 x 2 blocks of pixels. We do this by creating a 2 x 2 window and using strides of length 2 in each direction. To take the average, we convolve the 2 x 2 window with a constant filter of 0.25, as follows:

def mov_avg_layer(x):
    # Filter shape is (height, width, in_channels, out_channels); a constant
    # 0.25 over a 2 x 2 window averages the four pixels it covers.
    my_filter = tf.constant(0.25, shape=[2, 2, 1, 1])
    # Strides are (batch, height, width, channel): move 2 pixels at a time.
    my_strides = [1, 2, 2, 1]
    layer = tf.nn.conv2d(x, my_filter, my_strides,
                         padding='SAME', name='Moving_Avg_Window')
    return layer

Note that we are also naming this layer Moving_Avg_Window by using the name argument of the function.
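
Since the filter is a constant 0.25 over a 2 x 2 window moved with stride 2, this layer is just a 2 x 2 average pool. As a quick check (a sketch, assuming TF 2.x eager execution; any tiny difference is floating-point noise):

pooled = tf.nn.avg_pool2d(x_data, ksize=2, strides=2, padding='SAME')
print(tf.reduce_max(tf.abs(mov_avg_layer(x_data) - pooled)).numpy())  # ~0.0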

To figure out the output size of a convolutional layer, we can use the formula Output = (W - F + 2P)/S + 1, where W is the input size, F is the filter size, P is the amount of zero padding, and S is the stride.
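
For our layer, W = 4, F = 2, P = 0, and S = 2, so Output = (4 - 2 + 0)/2 + 1 = 2: a 2 x 2 feature map per image and channel. You can confirm this by inspecting the output shape:

print(mov_avg_layer(x_data).shape)  # (1, 2, 2, 1)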

Now, we define a custom layer that will operate on the 2 x 2 output of the moving-window average. The custom function will first multiply the input by a constant 2 x 2 matrix, then add 1 to each entry, and finally take the sigmoid of each element, returning a 2 x 2 matrix. Since the matrix multiplication here operates on two-dimensional matrices, we need to drop the extra size-1 dimensions of our image. TensorFlow can do this with the built-in squeeze() function. Here, we define the new layer:

def custom_layer(input_matrix):
    # Drop the size-1 batch and channel dimensions: (1, 2, 2, 1) -> (2, 2)
    input_matrix_squeezed = tf.squeeze(input_matrix)
    A = tf.constant([[1., 2.], [-1., 3.]])
    b = tf.constant(1., shape=[2, 2])
    temp1 = tf.matmul(A, input_matrix_squeezed)
    temp = tf.add(temp1, b)  # Ax + b
    return tf.sigmoid(temp)
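
One caveat if you experiment with a larger batch size, as suggested earlier: tf.squeeze() with no arguments removes every size-1 dimension, so custom_layer only works as written for a single image. A batch-friendly variant might look as follows (a sketch; the function name is illustrative, and it relies on tf.matmul broadcasting the rank-2 matrix across the batch dimension in TF 2.x):

def custom_layer_batched(input_matrix):
    # Drop only the channel axis, keeping the batch: (N, 2, 2, 1) -> (N, 2, 2)
    squeezed = tf.squeeze(input_matrix, axis=3)
    A = tf.constant([[1., 2.], [-1., 3.]])
    b = tf.constant(1., shape=[2, 2])
    # A is broadcast against each 2 x 2 matrix in the batch
    return tf.sigmoid(tf.matmul(A, squeezed) + b)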

Now, we have to arrange the two layers in the network. We will do this by calling one layer function after the other, as follows:

first_layer = mov_avg_layer(x_data) 
second_layer = custom_layer(first_layer)

Now, we just feed the 4 x 4 image into the functions. Finally, we can check the result (your values will differ, since the input image is random), as follows:

print(second_layer)
 
tf.Tensor(
[[0.9385519  0.90720266]
 [0.9247799  0.82272065]], shape=(2, 2), dtype=float32)

Let's now look more closely at how this works.

How it works...

The first layer is named Moving_Avg_Window, and the second is a collection of operations that we can group under the name Custom_Layer. The moving-average layer collapses the 4 x 4 image into a 2 x 2 summary, which the custom layer then transforms with a matrix multiplication, a shift, and a sigmoid. As the example shows, you can wrap all the layers into functions and call them one after the other, so that later layers process the outputs of earlier ones.
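
The Moving_Avg_Window name comes from the name argument we passed to conv2d(). To group the custom operations under a Custom_Layer scope, as promised at the start of the recipe, one option is tf.name_scope; a minimal sketch (the grouping is mainly visible when the computation is traced, for example inside a tf.function, and inspected in TensorBoard):

with tf.name_scope('Custom_Layer'):
    second_layer = custom_layer(first_layer)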
