Apache Spark Deep Learning Cookbook

Apache Spark Deep Learning Cookbook: Over 80 best practice recipes for the distributed training and deployment of neural networks using Keras and TensorFlow

By Ahmed Sherif, Amrith Ravindra
1.7 (6 Ratings)
eBook | Jul 2018 | 474 pages | 1st Edition

eBook: $29.99 (was $43.99)
Paperback: $54.99
Subscription: Free Trial, renews at $19.99 p/m

What do you get with eBook?

  • Instant access to your Digital eBook purchase
  • Download this book in EPUB and PDF formats
  • Access this title in our online reader with advanced features
  • DRM FREE - Read whenever, wherever and however you want

Apache Spark Deep Learning Cookbook

Creating a Neural Network in Spark

In this chapter, the following recipes will be covered:

  • Creating a dataframe in PySpark
  • Manipulating columns in a PySpark dataframe
  • Converting a PySpark dataframe into an array
  • Visualizing the array in a scatterplot
  • Setting up weights and biases for input into the neural network
  • Normalizing the input data for the neural network
  • Validating array for optimal neural network performance
  • Setting up the activation function with sigmoid
  • Creating the sigmoid derivative function
  • Calculating the cost function in a neural network
  • Predicting gender based on height and weight
  • Visualizing prediction scores

Introduction

Much of this book will focus on building deep learning algorithms with libraries in Python, such as TensorFlow and Keras. While these libraries are helpful for building deep neural networks without delving into the calculus and linear algebra of deep learning, this chapter will take a deep dive into building a simple neural network in PySpark to make a gender prediction based on height and weight. One of the best ways to understand the foundation of neural networks is to build a model from scratch, without any of the popular deep learning libraries. Once the foundation for a neural network framework is established, understanding and utilizing some of the more popular deep neural network libraries will become much simpler.

Creating a dataframe in PySpark

DataFrames will serve as the framework for all data used in building deep learning models. Similar to the pandas library in Python, PySpark has its own built-in functionality to create a dataframe.

Getting ready

There are several ways to create a dataframe in Spark. One common way is by importing a .txt, .csv, or .json file. Another method is to manually enter fields and rows of data into the PySpark dataframe, and while the process can be a bit tedious, it is helpful, especially when dealing with a small dataset. To predict gender based on height and weight, this chapter will build a dataframe manually in PySpark. The dataset used is as follows:

While the dataset will be manually added to PySpark in this chapter, the dataset can also be viewed and downloaded from the following link:

https://github.com/asherif844/ApacheSparkDeepLearningCookbook/blob/master/CH02/data/HeightAndWeight.txt

Finally, we will begin this chapter and future chapters...
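Before moving on, here is a minimal sketch of the manual approach (the rows below are illustrative placeholders, not the book's actual dataset), assuming a standard SparkSession:

from pyspark.sql import SparkSession

# Create (or reuse) a local Spark session
spark = SparkSession.builder.appName('NeuralNetwork').getOrCreate()

# Illustrative rows only: (height, weight, gender)
rows = [(70, 180, 'Male'),
        (64, 120, 'Female'),
        (72, 200, 'Male'),
        (62, 115, 'Female')]

df = spark.createDataFrame(rows, ['height', 'weight', 'gender'])
df.show()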

Manipulating columns in a PySpark dataframe

The dataframe is almost complete; however, there is one issue that requires addressing before building the neural network. Rather than keeping the gender value as a string, it is better to convert the value to a numeric integer for calculation purposes, which will become more evident as this chapter progresses.

Getting ready

This section will require importing the following:

  • from pyspark.sql import functions

How to do it...

This section walks through the steps for the string conversion to a numeric value in the dataframe:

  • Female --> 0
  • Male --> 1

  1. Converting a column value inside of a dataframe requires importing functions:
from pyspark.sql import functions
  2. Next, modify the gender column to a numeric value using the following script:
df = df.withColumn('gender',functions.when(df['gender']=='Female',0).otherwise(1))
  3. Finally, reorder the columns so that gender is the last column in the dataframe using the following script:
df = df.select('height', 'weight...
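The final select statement above is cut off; a short sketch of the full recipe, assuming the df created earlier and the 'height'/'weight'/'gender' column names, would look like this:

from pyspark.sql import functions

# Convert the gender string to an integer: Female -> 0, otherwise 1
df = df.withColumn('gender', functions.when(df['gender'] == 'Female', 0).otherwise(1))

# Reorder the columns so the gender label comes last
df = df.select('height', 'weight', 'gender')
df.show()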

Converting a PySpark dataframe to an array

In order to form the building blocks of the neural network, the PySpark dataframe must be converted into an array. Python has a very powerful library, numpy, that makes working with arrays simple.

Getting ready

The numpy library should already be available with the installation of the anaconda3 Python package. However, if for some reason the numpy library is not available, it can be installed using the following command at the terminal:

pip install numpy or sudo pip install numpy

pip install or sudo pip install will confirm whether the requirements are already satisfied for the requested library:

import numpy as np

How to do it...

This section walks through the steps to convert the dataframe into an array:

  1. View the data collected from the dataframe using the following script:
df.select("height", "weight", "gender").collect()
  2. Store the values from the collection into an array called data_array using the following script:
data_array = np.array(df.select("height", "weight", "gender").collect())
  3. Execute...
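A compact sketch of those steps (assuming the df from the previous recipes), including a quick shape check on the resulting array:

import numpy as np

# Collect the rows to the driver and convert them into a numpy array
data_array = np.array(df.select("height", "weight", "gender").collect())

# Each row is [height, weight, gender]; confirm the dimensions before moving on
print(data_array.shape)
print(data_array[0])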

Visualizing an array in a scatterplot

The goal of the neural network that will be developed in this chapter is to predict the gender of an individual if the height and weight are known. A powerful method for understanding the relationship between height, weight, and gender is by visualizing the data points feeding the neural network. This can be done with the popular Python visualization library matplotlib.

Getting ready

As was the case with numpy, matplotlib should be available with the installation of the anaconda3 Python package. However, if for some reason matplotlib is not available, it can be installed using the following command at the terminal:

pip install matplotlib or sudo pip install matplotlib

pip install or sudo pip install will confirm whether the requirements are already satisfied for the requested library.

How to do it...

This section walks through the steps to visualize an array through a scatterplot.

  1. Import the matplotlib library and configure the library to visualize plots inside of the Jupyter notebook using the following script:
import...
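A minimal sketch of that visualization, assuming the data_array from the previous recipe with gender already encoded as 0 (female) and 1 (male); the colors and plot labels are illustrative choices:

import matplotlib.pyplot as plt

# Split the array by the encoded gender column so each group gets its own color
male = data_array[data_array[:, 2] == 1]
female = data_array[data_array[:, 2] == 0]

plt.scatter(male[:, 0], male[:, 1], c='blue', label='Male')
plt.scatter(female[:, 0], female[:, 1], c='red', label='Female')
plt.xlabel('height')
plt.ylabel('weight')
plt.legend()
plt.show()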

Setting up weights and biases for input into the neural network

The framework in PySpark and the data are now complete. It is time to move on to building the neural network. Regardless of the complexity of the neural network, the development follows a similar path:

  1. Input data
  2. Add the weights and biases
  3. Sum the product of the data and weights
  4. Apply an activation function
  5. Evaluate the output and compare it to the desired outcome

This section will focus on setting the weights that create the input which feeds into the activation function.

Getting ready

A cursory understanding of the building blocks of a simple neural network is helpful in understanding this section and the rest of the chapter.  Each neural network has...
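The text above is cut off; as a rough sketch of this step (the single-neuron layout, dimensions, and random seed are illustrative assumptions, not the book's exact configuration), weights and a bias for the two inputs can be initialized with numpy:

import numpy as np

np.random.seed(12345)          # illustrative seed for repeatable results

# One weight per input (height, weight) plus a single bias term
w1 = np.random.randn()         # weight applied to height
w2 = np.random.randn()         # weight applied to weight
b1 = np.random.randn()         # bias

def weighted_sum(height, weight):
    # The value that will later be passed through the activation function
    return height * w1 + weight * w2 + b1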

Normalizing the input data for the neural network

Neural networks work more efficiently when the inputs are normalized. Normalization keeps any single input with a large magnitude from dominating the outcome over inputs with smaller magnitudes. This section will normalize the height and weight inputs of the current individuals.

Getting ready

The normalization of input values requires obtaining the mean and standard deviation of those values for the final calculation.

How to do it...

This section walks through the steps to normalize the height and weight...
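A minimal sketch of that normalization (assuming data_array from earlier, with the first two columns holding height and weight):

import numpy as np

# Separate the inputs (height, weight) from the label (gender)
X = data_array[:, :2].astype(float)
y = data_array[:, 2]

# Scale each input column to zero mean and unit standard deviation
X = (X - X.mean(axis=0)) / X.std(axis=0)

print(X.mean(axis=0))   # roughly 0 for each column
print(X.std(axis=0))    # roughly 1 for each column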

Validating array for optimal neural network performance

A little bit of validation goes a long way in ensuring that our array is normalized for optimal performance within our upcoming neural network.  

Getting ready

This section will require a bit of numpy magic using the numpy.stack() function.

How to do it...

The following steps walk through validating that our array has been normalized.

  1. Execute the following step to print the mean and standard deviation of array inputs:
print('standard deviation')
print(round(X[:,0].std(axis=0),0))
print('mean...
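The printout above is cut off; a plausible completion of the same check (a sketch, assuming the normalized X from the previous recipe), followed by the numpy.stack() call mentioned in Getting ready to recombine the validated columns:

import numpy as np

print('standard deviation')
print(round(X[:, 0].std(axis=0), 0), round(X[:, 1].std(axis=0), 0))
print('mean')
print(round(X[:, 0].mean(axis=0), 0), round(X[:, 1].mean(axis=0), 0))

# Recombine the two validated input columns into a single (n_samples, 2) array
X = np.stack((X[:, 0], X[:, 1]), axis=1)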


Key benefits

  • Train distributed complex neural networks on Apache Spark
  • Use TensorFlow and Keras to train and deploy deep learning models
  • Explore practical tips to enhance performance

Description

Organizations these days need to integrate popular big data tools such as Apache Spark with highly efficient deep learning libraries if they’re looking to gain faster and more powerful insights from their data. With this book, you’ll discover over 80 recipes to help you train fast, enterprise-grade, deep learning models on Apache Spark. Each recipe addresses a specific problem, and offers a proven, best-practice solution to difficulties encountered while implementing various deep learning algorithms in a distributed environment. The book follows a systematic approach, featuring a balance of theory and tips with best practice solutions to assist you with training different types of neural networks such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs). You’ll also have access to code written in TensorFlow and Keras that you can run on Spark to solve a variety of deep learning problems in computer vision and natural language processing (NLP), or tweak to tackle other problems encountered in deep learning. By the end of this book, you'll have the skills you need to train and deploy state-of-the-art deep learning models on Apache Spark.

Who is this book for?

If you’re looking for a practical resource for efficiently implementing distributed deep learning models with Apache Spark, then this book is for you. Knowledge of core machine learning concepts and a basic understanding of the Apache Spark framework is required to get the most out of this book. Some knowledge of Python programming will also be useful.

What you will learn

  • Set up a fully functional Spark environment
  • Understand practical machine learning and deep learning concepts
  • Employ built-in machine learning libraries within Spark
  • Discover libraries that are compatible with TensorFlow and Keras
  • Explore NLP models such as word2vec and TF-IDF on Spark
  • Organize DataFrames for deep learning evaluation
  • Apply testing and training modeling to ensure accuracy
  • Access readily available code that can be reused

Product Details

Publication date : Jul 13, 2018
Length: 474 pages
Edition : 1st
Language : English
ISBN-13 : 9781788471558




Table of Contents

14 Chapters
  1. Setting Up Spark for Deep Learning Development
  2. Creating a Neural Network in Spark
  3. Pain Points of Convolutional Neural Networks
  4. Pain Points of Recurrent Neural Networks
  5. Predicting Fire Department Calls with Spark ML
  6. Using LSTMs in Generative Networks
  7. Natural Language Processing with TF-IDF
  8. Real Estate Value Prediction Using XGBoost
  9. Predicting Apple Stock Market Cost with LSTM
  10. Face Recognition Using Deep Convolutional Networks
  11. Creating and Visualizing Word Vectors Using Word2Vec
  12. Creating a Movie Recommendation Engine with Keras
  13. Image Classification with TensorFlow on Spark
  14. Other Books You May Enjoy

Customer reviews

Rating distribution: 1.7 (6 Ratings)
5 star: 16.7%
4 star: 0%
3 star: 0%
2 star: 0%
1 star: 83.3%

Adnan Masood, PhD Aug 16, 2018
5 stars
The tremendous impact of Artificial Intelligence and Machine Learning, and the uncanny effectiveness of deep neural networks, are hard to escape in both academia and industry. Meanwhile, implementation-grade material outlining deep learning using Spark is not always easy to find. The manuscript you are holding in your hands (or in your e-reader) is a problem-solution oriented approach which not only shows Spark’s capabilities but also the art of the possible around various machine learning and deep learning problems.

Full disclosure, I am the technical reviewer of the book and wrote the foreword. It was a pleasure reading and reviewing Ahmed Sherif and Amrith Ravindra’s work, which I hope you as a reader will also find very compelling.

The authors begin with helping to set up Spark for Deep Learning development by providing clear and concisely written recipes. The initial setup is naturally followed by creating a neural network, and elaborating on the pain points of Convolutional Neural Networks and Recurrent Neural Networks. Later on, the authors provide practical (yet simplified) use cases of predicting fire department calls with Spark ML, real estate value prediction using XGBoost, predicting the stock market cost of Apple with LSTM, and creating a movie recommendation engine with Keras. The book covers pertinent and highly relevant technologies with operational use cases like LSTMs in generative networks, natural language processing with TF-IDF, face recognition using deep convolutional networks, creating and visualizing word vectors using Word2Vec, and image classification with TensorFlow on Spark. Besides crisp and focused writing, this wide array of highly relevant machine learning and deep learning topics gives the book its core strength.

I hope this book will help you leverage Apache Spark to tackle business and technology problems. Highly recommended reading.
Amazon Verified review
Todd Sep 06, 2023
1 star
$55 bucks… not worth it. There’s this website called google
Amazon Verified review
Bethany Sep 06, 2023
1 star
An absolutely disgrace from cover to cover. I’d rather eat the book than use the recipes.
Amazon Verified review
Amanda Sep 02, 2023
1 star
Wouldn’t recommend
Amazon Verified review
Patrick Sullivan Sep 06, 2023
1 star
Total Garbage. Not worth it
Amazon Verified review

FAQs

How do I buy and download an eBook?

Where there is an eBook version of a title available, you can buy it from the book details for that title. Add either the standalone eBook or the eBook and print book bundle to your shopping cart. Your eBook will show in your cart as a product on its own. After completing checkout and payment in the normal way, you will receive your receipt on the screen containing a link to a personalised PDF download file. This link will remain active for 30 days. You can download backup copies of the file by logging in to your account at any time.

If you already have Adobe Reader installed, then clicking on the link will download and open the PDF file directly. If you don't, then save the PDF file on your machine and download the Reader to view it.

Please Note: Packt eBooks are non-returnable and non-refundable.

Packt eBook and Licensing

When you buy an eBook from Packt Publishing, completing your purchase means you accept the terms of our licence agreement. Please read the full text of the agreement. In it we have tried to balance the need for the ebook to be usable for you the reader with our needs to protect the rights of us as Publishers and of our authors. In summary, the agreement says:

  • You may make copies of your eBook for your own use onto any machine
  • You may not pass copies of the eBook on to anyone else
How can I make a purchase on your website?

If you want to purchase a video course, eBook or Bundle (Print+eBook), please follow the steps below:

  1. Register on our website using your email address and a password.
  2. Search for the title by name or ISBN using the search option.
  3. Select the title you want to purchase.
  4. Choose the format you wish to purchase the title in; if you order the Print Book, you get a free eBook copy of the same title.
  5. Proceed with the checkout process (payment can be made using Credit Card, Debit Card, or PayPal).
Where can I access support around an eBook?
  • If you experience a problem with using or installing Adobe Reader, then contact Adobe directly.
  • To view the errata for the book, see www.packtpub.com/support and view the pages for the title you have.
  • To view your account details or to download a new copy of the book go to www.packtpub.com/account
  • To contact us directly if a problem is not resolved, use www.packtpub.com/contact-us
What eBook formats does Packt support?

Our eBooks are currently available in a variety of formats such as PDF and ePub. In the future, this may well change with trends and developments in technology, but please note that our PDFs are not Adobe eBook Reader format, which has greater restrictions on security.

You will need to use Adobe Reader v9 or later in order to read Packt's PDF eBooks.

What are the benefits of eBooks?
  • You can get the information you need immediately
  • You can easily take them with you on a laptop
  • You can download them an unlimited number of times
  • You can print them out
  • They are copy-paste enabled
  • They are searchable
  • There is no password protection
  • They are lower in price than print
  • They save resources and space
What is an eBook?

Packt eBooks are a complete electronic version of the print edition, available in PDF and ePub formats. Every piece of content down to the page numbering is the same. Because we save the costs of printing and shipping the book to you, we are able to offer eBooks at a lower cost than print editions.

When you have purchased an eBook, simply log in to your account and click on the link in Your Download Area. We recommend saving the file to your hard drive before opening it.

For optimal viewing of our eBooks, we recommend you download and install the free Adobe Reader version 9.