Declaring Tensors

Tensors are the primary data structure that TensorFlow uses to operate on the computational graph. We can declare these tensors as variables and/or feed them in as placeholders. First, we must know how to create tensors.

Getting ready

When we create a tensor and declare it to be a variable, TensorFlow creates several graph structures in our computation graph. It is also important to point out that just by creating a tensor, TensorFlow is not adding anything to the computational graph; TensorFlow does this only after we create a variable out of the tensor. See the next section on variables and placeholders for more information.
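
As a minimal sketch (assuming the TensorFlow 1.x-style API used in these recipes), we can list the operations that are added to a graph when a tensor is declared as a variable:

import tensorflow as tf  # TensorFlow 1.x-style API, as used in these recipes

graph = tf.Graph()
with graph.as_default():
    my_var = tf.Variable(tf.zeros([2, 3]))  # declare a zero-filled tensor as a variable

# Declaring the variable adds several structures to the graph: the initial
# value, the variable itself, its assign operation, and its read operation.
print([op.name for op in graph.get_operations()])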

How to do it…

Here we will cover the main ways to create tensors in TensorFlow:

  1. Fixed tensors:
    • Create a zero-filled tensor. Use the following:
      zero_tsr = tf.zeros([row_dim, col_dim])
    • Create a one-filled tensor. Use the following:
      ones_tsr = tf.ones([row_dim, col_dim])
    • Create a constant-filled tensor. Use the following:
      filled_tsr = tf.fill([row_dim, col_dim], 42)
    • Create a tensor out of an existing constant. Use the following:
      constant_tsr = tf.constant([1,2,3])

    Note

    Note that the tf.constant() function can be used to broadcast a value into an array, mimicking the behavior of tf.fill(), by writing tf.constant(42, shape=[row_dim, col_dim]).

  2. Tensors of similar shape:
    • We can also initialize variables based on the shape of other tensors, as follows:
      zeros_similar = tf.zeros_like(constant_tsr)
      ones_similar = tf.ones_like(constant_tsr)

    Note

    Note that since these tensors depend on prior tensors, we must initialize them in order. Attempting to initialize all the tensors at once would result in an error. See the There's more… section at the end of the next section on variables and placeholders.

  3. Sequence tensors:
    • TensorFlow allows us to specify tensors that contain defined intervals. The following functions behave very similarly to Python's range() outputs and NumPy's linspace() outputs. See the following function:
      linear_tsr = tf.linspace(start=0.0, stop=1.0, num=3)
    • The resulting tensor is the sequence [0.0, 0.5, 1.0]. Note that this function includes the specified stop value. See the following function:
      integer_seq_tsr = tf.range(start=6, limit=15, delta=3)
    • The result is the sequence [6, 9, 12]. Note that this function does not include the limit value. (Both sequences are evaluated in the session sketch after this list.)
  4. Random tensors:
    • The following generated random numbers are from a uniform distribution:
      randunif_tsr = tf.random_uniform([row_dim, col_dim], minval=0, maxval=1)
    • Note that this random uniform distribution draws from the interval that includes the minval but not the maxval (minval <= x < maxval).
    • To get a tensor with random draws from a normal distribution, use the following:
      randnorm_tsr = tf.random_normal([row_dim, col_dim], mean=0.0, stddev=1.0)
    • There are also times when we wish to generate normal random values that are assured to be within certain bounds. The truncated_normal() function always picks normal values within two standard deviations of the specified mean. See the following:
      truncnorm_tsr = tf.truncated_normal([row_dim, col_dim], mean=0.0, stddev=1.0)
    • We might also be interested in randomizing entries of arrays. To accomplish this, there are two functions that help us: random_shuffle() and random_crop(). See the following:
      shuffled_output = tf.random_shuffle(input_tensor)
      cropped_output = tf.random_crop(input_tensor, crop_size)
    • Later on in this book, we will be interested in randomly cropping an image of size (height, width, 3), where there are three color channels. To fix a dimension in the cropped_output, you must give it the maximum size in that dimension:
      cropped_image = tf.random_crop(my_image, [height//2, width//2, 3])
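
The recipes above only define operations; to see concrete values, the tensors have to be evaluated in a session. Here is a minimal sketch (assuming the TensorFlow 1.x-style API used in these recipes, with small illustrative dimensions):

import tensorflow as tf  # TensorFlow 1.x-style API, as used in these recipes

row_dim, col_dim = 2, 3  # illustrative dimensions, not fixed by the recipe

filled_tsr = tf.fill([row_dim, col_dim], 42)
broadcast_tsr = tf.constant(42, shape=[row_dim, col_dim])  # mimics tf.fill()
linear_tsr = tf.linspace(start=0.0, stop=1.0, num=3)
integer_seq_tsr = tf.range(start=6, limit=15, delta=3)
randunif_tsr = tf.random_uniform([row_dim, col_dim], minval=0, maxval=1)

with tf.Session() as sess:
    print(sess.run(linear_tsr))       # the sequence [0.0, 0.5, 1.0]
    print(sess.run(integer_seq_tsr))  # the sequence [6, 9, 12]
    print(sess.run(tf.reduce_all(tf.equal(filled_tsr, broadcast_tsr))))  # True
    print(sess.run(randunif_tsr))     # 2x3 draws from [0, 1)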

How it works…

Once we have decided how to create the tensors, we can also create the corresponding variables by wrapping the tensor in the Variable() function, as follows (more on this in the next section):

my_var = tf.Variable(tf.zeros([row_dim, col_dim]))
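
As a minimal sketch (assuming the TensorFlow 1.x-style API used in these recipes), the variable must still be initialized before it can be read in a session; more on initialization in the next section:

import tensorflow as tf  # TensorFlow 1.x-style API, as used in these recipes

row_dim, col_dim = 2, 3  # illustrative dimensions
my_var = tf.Variable(tf.zeros([row_dim, col_dim]))

with tf.Session() as sess:
    # Variables must be explicitly initialized before they are evaluated.
    sess.run(tf.global_variables_initializer())
    print(sess.run(my_var))  # a 2x3 array of zeros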

There's more…

We are not limited to the built-in functions. We can convert any NumPy array, Python list, or constant to a tensor using the convert_to_tensor() function. Note that this function also accepts tensors as an input, in case we wish to generalize a computation inside a function.
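
For example, here is a minimal sketch (assuming the TensorFlow 1.x-style API used in these recipes; the input values are purely illustrative):

import numpy as np
import tensorflow as tf  # TensorFlow 1.x-style API, as used in these recipes

# convert_to_tensor() accepts NumPy arrays, Python lists, constants, and tensors.
array_tsr = tf.convert_to_tensor(np.array([[1., 2.], [3., 4.]]))
list_tsr = tf.convert_to_tensor([1, 2, 3])
passthrough_tsr = tf.convert_to_tensor(array_tsr)  # an existing tensor passes through

with tf.Session() as sess:
    print(sess.run(array_tsr))  # [[1. 2.] [3. 4.]]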
