Apache Spark Deep Learning Cookbook

Over 80 best practice recipes for the distributed training and deployment of neural networks using Keras and TensorFlow

Product type: Paperback
Published: Jul 2018
Publisher: Packt
ISBN-13: 9781788474221
Length: 474 pages
Edition: 1st Edition
Authors (2): Ahmed Sherif, Amrith Ravindra
Table of Contents (15 chapters)

Preface
1. Setting Up Spark for Deep Learning Development
2. Creating a Neural Network in Spark
3. Pain Points of Convolutional Neural Networks
4. Pain Points of Recurrent Neural Networks
5. Predicting Fire Department Calls with Spark ML
6. Using LSTMs in Generative Networks
7. Natural Language Processing with TF-IDF
8. Real Estate Value Prediction Using XGBoost
9. Predicting Apple Stock Market Cost with LSTM
10. Face Recognition Using Deep Convolutional Networks
11. Creating and Visualizing Word Vectors Using Word2Vec
12. Creating a Movie Recommendation Engine with Keras
13. Image Classification with TensorFlow on Spark
14. Other Books You May Enjoy

Installing and configuring Spark and prerequisites on Ubuntu Desktop

Before Spark can get up and running, a few prerequisites need to be installed on a newly minted Ubuntu Desktop. This section will focus on installing and configuring the following on Ubuntu Desktop:

  • Java 8 or higher
  • Anaconda
  • Spark

Getting ready

The only requirement for this section is having administrative rights to install applications onto the Ubuntu Desktop.

How to do it...

This section walks through the steps in the recipe to install Java, Anaconda, and Spark on Ubuntu Desktop:

  1. Install Java on Ubuntu through the terminal application, which can be found by searching for the app and then locking it to the launcher on the left-hand side of the screen.
  2. Perform an initial test for Java on the virtual machine by executing the following command at the terminal:
java -version
  3. Execute the following four commands at the terminal to install Java:
$ sudo apt-get install software-properties-common
$ sudo add-apt-repository ppa:webupd8team/java
$ sudo apt-get update
$ sudo apt-get install oracle-java8-installer
  4. After accepting the necessary license agreements for Oracle, perform a secondary test of Java on the virtual machine by executing java -version once again in the terminal. A successful installation for Java will display the following outcome in the terminal:
$ java -version
java version "1.8.0_144"
Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
Java HotSpot(TM) 64-Bit Server VM (build 25.144-b01, mixed mode)
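Optionally, some Spark and Hadoop tooling expects the JAVA_HOME environment variable to be set. A minimal sketch, assuming the Oracle installer placed the JDK in its default /usr/lib/jvm/java-8-oracle location (verify the path on your machine first):

$ echo 'export JAVA_HOME=/usr/lib/jvm/java-8-oracle' >> ~/.bashrc
$ source ~/.bashrc
$ echo $JAVA_HOME
/usr/lib/jvm/java-8-oracle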
  5. Next, install the most recent version of Anaconda. Current versions of Ubuntu Desktop come preinstalled with Python. While it is convenient that Python comes preinstalled with Ubuntu, the installed version is Python 2.7, as seen in the following output:
$ python --version
Python 2.7.12
  6. The current version of Anaconda is v4.4 and the current version of Python 3 is v3.6. Once the installer has been downloaded, view the Anaconda installation file by accessing the Downloads folder using the following commands:
$ cd Downloads/
~/Downloads$ ls
Anaconda3-4.4.0-Linux-x86_64.sh
  7. Once in the Downloads folder, initiate the installation for Anaconda by executing the following command:
~/Downloads$ bash Anaconda3-4.4.0-Linux-x86_64.sh 
Welcome to Anaconda3 4.4.0 (by Continuum Analytics, Inc.)
In order to continue the installation process, please review the license agreement.
Please, press ENTER to continue
Please note that the version of Anaconda, as well as any other software installed, may differ as newer updates are released to the public. The version of Anaconda that we are using in this chapter and in this book can be downloaded from https://repo.continuum.io/archive/Anaconda3-4.4.0-Linux-x86_64.sh
  8. Once the Anaconda installation is complete, restart the Terminal application to confirm that Python 3 is now the default Python environment through Anaconda by executing python --version in the terminal:
$ python --version
Python 3.6.1 :: Anaconda 4.4.0 (64-bit)
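As an additional sanity check, not part of the original steps, confirm that the python command now resolves to the Anaconda installation rather than the system Python; the path shown will reflect your own username:

$ which python
/home/username/anaconda3/bin/python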
  9. The Python 2 version is still available under Linux, but will require an explicit call when executing a script, as seen in the following command:
~$ python2 --version
Python 2.7.12
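For instance, a legacy script, here a hypothetical legacy_script.py, would be invoked explicitly as follows:

$ python2 legacy_script.py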
  10. Visit the following website to begin the Spark download and installation process:

https://spark.apache.org/downloads.html

  11. Select the download link. The following file will be downloaded to the Downloads folder in Ubuntu:

spark-2.2.0-bin-hadoop2.7.tgz

  12. View the file at the terminal level by executing the following commands:
$ cd Downloads/
~/Downloads$ ls
spark-2.2.0-bin-hadoop2.7.tgz
  13. Extract the tgz file by executing the following command:
~/Downloads$ tar -zxvf spark-2.2.0-bin-hadoop2.7.tgz
  14. Another look at the Downloads directory using ls shows both the tgz file and the extracted folder:
~/Downloads$ ls
spark-2.2.0-bin-hadoop2.7 spark-2.2.0-bin-hadoop2.7.tgz
  15. Move the extracted folder from the Downloads folder to the Home folder by executing the following command:
~/Downloads$ mv spark-2.2.0-bin-hadoop2.7 ~/
~/Downloads$ ls
spark-2.2.0-bin-hadoop2.7.tgz
~/Downloads$ cd
~$ ls
anaconda3 Downloads Pictures Templates
Desktop examples.desktop Public Videos
Documents Music spark-2.2.0-bin-hadoop2.7
  16. Now, the spark-2.2.0-bin-hadoop2.7 folder has been moved to the Home folder, which can be viewed by selecting the Files icon on the left-hand side toolbar.
  17. Spark is now installed. Initiate Spark by executing the following commands at the terminal:
~$ cd ~/spark-2.2.0-bin-hadoop2.7/
~/spark-2.2.0-bin-hadoop2.7$ ./bin/pyspark
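Note that pyspark was launched with a path relative to the Spark folder. As an optional convenience, an addition to this recipe rather than one of its steps, the bin folder can be appended to the PATH in ~/.bashrc so that future sessions can launch pyspark from any directory:

$ echo 'export SPARK_HOME=~/spark-2.2.0-bin-hadoop2.7' >> ~/.bashrc
$ echo 'export PATH=$SPARK_HOME/bin:$PATH' >> ~/.bashrc
$ source ~/.bashrc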
  18. Perform a final test at the terminal by executing the following command, to confirm that the SparkContext is driving the cluster in the local environment:
>>> sc
<SparkContext master=local[*] appName=PySparkShell>
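To take the final test one step further, a small job can be run inside the same pyspark shell; this quick smoke test is an addition to the original recipe:

>>> rdd = sc.parallelize(range(100))
>>> rdd.sum()
4950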

How it works...

This section explains the reasoning behind the installation process for Java, Anaconda, and Spark.

  1. Since Spark runs on the Java Virtual Machine (JVM), the Java Development Kit (JDK) is a prerequisite installation for Spark to run on an Ubuntu virtual machine. In order for Spark to run on a local machine or in a cluster, Java 8 or higher is required.
  2. Ubuntu recommends the sudo apt install method for Java, as it ensures that the packages downloaded are up to date.
  3. Please note that if Java is not currently installed, the output in the terminal will show the following message (an OpenJDK alternative is sketched after this output):
    The program 'java' can be found in the following packages:
    * default-jre
    * gcj-5-jre-headless
    * openjdk-8-jre-headless
    * gcj-4.8-jre-headless
    * gcj-4.9-jre-headless
    * openjdk-9-jre-headless
    Try: sudo apt install <selected package>
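Should the Oracle PPA be unavailable, the openjdk packages listed in the preceding output offer an alternative route to Java 8; for example, a sketch using the full OpenJDK 8 development kit:

    $ sudo apt install openjdk-8-jdk
    $ java -version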
  4. While Python 2 is fine, it is considered legacy Python. Python 2 is facing an end-of-life date in 2020; therefore, it is recommended that all new Python development be performed with Python 3, as will be the case in this publication. Until recently, Spark was only available with Python 2. That is no longer the case; Spark works with both Python 2 and 3. A convenient way to install Python 3, as well as many dependencies and libraries, is through Anaconda. Anaconda is a free and open source distribution of Python and R. Anaconda manages the installation and maintenance of many of the most common packages used in Python for data science-related tasks.
  5. During the installation process for Anaconda, it is important to confirm the following conditions:

      • Anaconda is installed in the /home/username/anaconda3 location
      • The Anaconda installer prepends the anaconda3 install location to the PATH variable in /home/username/.bashrc; the appended line is sketched after this list
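For reference, the appended line looks similar to the following sketch; the exact path depends on your username and install location:

      export PATH="/home/username/anaconda3/bin:$PATH"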
  6. After Anaconda has been installed, download Spark. Unlike Python, Spark does not come preinstalled on Ubuntu, and it will therefore need to be downloaded and installed.
  7. For the purposes of development with deep learning, the following preferences will be selected for Spark; a direct-download alternative is sketched after this list:

      • Spark release: 2.2.0 (Jul 11 2017)
      • Package type: Prebuilt for Apache Hadoop 2.7 and later
      • Download type: Direct download
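As an alternative to downloading through the browser, the same package can be fetched directly at the terminal; this sketch assumes the Apache archive URL for the 2.2.0 release:

      $ wget https://archive.apache.org/dist/spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz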
  8. Once Spark has been successfully installed, the output from executing Spark at the command line displays a startup banner similar to the one described in the next point.
  9. Two important features to note when initializing Spark are that it runs under the Python 3.6.1 | Anaconda 4.4.0 (64-bit) framework and that the Spark logo shows version 2.2.0.
  10. Congratulations! Spark is successfully installed on the local Ubuntu virtual machine. However, not everything is complete. Spark development is best when Spark code can be executed within a Jupyter notebook, especially for deep learning. Thankfully, Jupyter was installed with the Anaconda distribution earlier in this section; one way to pair the two is sketched below.
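A common approach, sketched here using PySpark's standard driver environment variables rather than a step from this recipe, is to point the PySpark driver at Jupyter before launching:

$ export PYSPARK_DRIVER_PYTHON=jupyter
$ export PYSPARK_DRIVER_PYTHON_OPTS='notebook'
$ ~/spark-2.2.0-bin-hadoop2.7/bin/pyspark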

There's more...

You may be asking why we did not just use pip install pyspark to access Spark in Python. Previous versions of Spark required going through the installation process that we did in this section. Starting with 2.2.0, Spark can also be installed directly through pip. We used the full installation method in this section to ensure that you will be able to get Spark installed and fully integrated, in case you are using an earlier version of Spark.
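For completeness, a pip-based installation of the release used in this chapter would look similar to the following; the versions available on PyPI may vary:

$ pip install pyspark==2.2.0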

See also
