Deep Learning with R for Beginners

You're reading from Deep Learning with R for Beginners: Design neural network models in R 3.5 using TensorFlow, Keras, and MXNet

Product type: Course
Published in: May 2019
Publisher: Packt
ISBN-13: 9781838642709
Length: 612 pages
Edition: 1st Edition
Authors (4):
Mark Hodnett
Pablo Maldonado
Joshua F. Wiley
Yuxi (Hayden) Liu

Table of Contents (23)

Title Page
Copyright and Credits
About Packt
Contributors
Preface
1. Getting Started with Deep Learning
2. Training a Prediction Model (FREE CHAPTER)
3. Deep Learning Fundamentals
4. Training Deep Prediction Models
5. Image Classification Using Convolutional Neural Networks
6. Tuning and Optimizing Models
7. Natural Language Processing Using Deep Learning
8. Deep Learning Models Using TensorFlow in R
9. Anomaly Detection and Recommendation Systems
10. Running Deep Learning Models in the Cloud
11. The Next Level in Deep Learning
12. Handwritten Digit Recognition using Convolutional Neural Networks
13. Traffic Signs Recognition for Intelligent Vehicles
14. Fraud Detection with Autoencoders
15. Text Generation using Recurrent Neural Networks
16. Sentiment Analysis with Word Embedding
Other Books You May Enjoy
Index

Index

A

  • activation functions / Neural networks as an extension of linear regression, Activation functions
  • activation layer / Extracting richer representation with CNNs
  • Adagrad / But what is a recurrent neural network, really?
  • Adaptive Boosting (AdaBoost) / How does deep learning become a state-of-the-art solution?
  • advanced deep learning text classification
    • about / Advanced deep learning text classification
    • 1-d convolutional neural network model / 1D convolutional neural network model
    • recurrent neural networks (RNN) / Recurrent neural network model
    • Long Short Term Memory model / Long short term memory model
    • Gated Recurrent Units (GRUs) / Gated Recurrent Units model
    • bi-directional LSTM model / Bidirectional LSTM model
    • stacked bi-directional model / Stacked bidirectional model
    • bi-directional, with 1-d convolutional neural network model / Bidirectional with 1D convolutional neural network model
  • AlexNet / The ImageNet dataset
  • Amazon Machine Image (AMI) / A brief introduction to AWS, Creating a deep learning AMI in AWS
  • array.batch.size parameter / The array.batch.size parameter
  • artificial intelligence / What is deep learning?
  • artificial neural network (ANN) / What is deep learning and why do we need it?
  • auto-encoders
    • working / How do auto-encoders work?
    • regularized auto-encoders / Regularized auto-encoders
    • penalized auto-encoders / Penalized auto-encoders
    • denoising auto-encoders / Denoising auto-encoders
    • training, in R / Training an auto-encoder in R
    • features, accessing / Accessing the features of the auto-encoder model
    • used, for anomaly detection / Using auto-encoders for anomaly detection
  • autoencoder
    • example / Our first examples
    • about / Autoencoders and MNIST
    • credit card, fraud detection / Credit card fraud detection with autoencoders
    • exploratory data analysis / Exploratory data analysis
    • Keras / The autoencoder approach – Keras
    • fraud detection, with H2O / Fraud detection with H2O
  • AWS
    • using, for deep learning / Using AWS for deep learning
    • working / A brief introduction to AWS
    • virtual machine, using / A brief introduction to AWS
    • deep learning GPU instance, creating / Creating a deep learning GPU instance in AWS
    • deep learning AMI, creating / Creating a deep learning AMI in AWS
  • Azure
    • using, for deep learning / Using Azure for deep learning

B

  • backpropagation algorithm / Going from logistic regression to single-layer neural networks
  • backpropagation through time / But what is a recurrent neural network, really?
  • backward-propagation / Neural networks as an extension of linear regression
  • benchmark
    • about / Bag of words benchmark
    • data, preparing / Preparing the data
    • implementing / Implementing a benchmark – logistic regression 
  • bi-directional LSTM model / Bidirectional LSTM model
  • Bi-directional LSTM networks / Bi-directional LSTM networks
  • binary classification model / The binary classification model
    • improving / Improving the binary classification model

C

  • checkpoint package
    • reference / Setting up reproducible results
  • classification
    • about / What is deep learning?
    • with fashion MNIST dataset / Classification using the fashion MNIST dataset
  • classification and regression training (caret) / First attempt – logistic regression
  • co-occurrence matrix / GloVe
  • collaborative filtering
    • about / Use case – collaborative filtering
    • data, preparing / Preparing the data
    • model, building / Building a collaborative filtering model
    • deep learning model, building / Building a deep learning collaborative filtering model
    • deep learning model, applying / Applying the deep learning model to a business problem
  • comma-delimited (CSV) format / Data download and exploration
  • computation graph
    • example / Introduction to the MXNet deep learning library
  • Computer Vision and Pattern Recognition (CVPR) / How does deep learning become a state-of-the-art solution?
  • contrastive divergence algorithm / Deep neural networks
  • convolutional layer / Extracting richer representation with CNNs
  • convolutional layers
    • about / Convolutional layers
    • pooling layers / Pooling layers
    • dropout / Dropout
    • flatten layers / Flatten layers, dense layers, and softmax
    • softmax layer / Flatten layers, dense layers, and softmax
    • dense layers / Flatten layers, dense layers, and softmax
  • convolutional neural networks (CNNs)
    • about / Deep neural networks, MXNet, CNNs
    • with TensorFlow / Convolutional neural networks using TensorFlow
    • used, for handwritten digit recognition / Handwritten digit recognition using CNNs
    • used, for German Traffic Sign Recognition Benchmark (GTSRB) / Traffic sign recognition using CNN
    • MXNet, used / First solution – convolutional neural networks using MXNet
    • Keras, used with TensorFlow in CNNs / Trying something new – CNNs using Keras with TensorFlow
    • methods, reviewing to prevent overfitting / Reviewing methods to prevent overfitting in CNNs
  • corpus / Comparing traditional text classification and deep learning
  • cost function / Neural networks as an extension of linear regression
  • credit card
    • fraud detection, with autoencoder / Credit card fraud detection with autoencoders
  • cross-entropy method / RNN without derivatives — the cross-entropy method
  • ctx parameter / The symbol, X, y, and ctx parameters
  • customer lifetime value (CLV) / Evaluating performance

D

  • 1-d convolutional neural network model / 1D convolutional neural network model
  • 2D example / A simple 2D example
  • data
    • preparing / Preparing the data
  • data augmentation
    • about / Data augmentation
    • training data, increasing / Using data augmentation to increase the training data
    • test time augmentation / Test time augmentation
    • using, in deep learning libraries / Using data augmentation in deep learning libraries
    / Dealing with a small training set – data augmentation
  • data cleansing / The importance of data cleansing
  • data distributions / Different data distributions
  • data exploration
    • about / Warm-up – data exploration
    • tidy text, working / Working with tidy text
    • n-grams, calculating instead of single words / The more, the merrier – calculating n-grams instead of single words
  • data leakage / Data leakage
  • data partitioning / Data partition between training, test, and validation sets
  • data preparation / Data preparation
  • data preprocessing / Data preprocessing
  • dataset
    • URL, for downloading / Exploratory data analysis
  • deep belief networks (DBNs) / Deep neural networks
  • deep feedforward neural networks / Getting started with deep feedforward neural networks
  • deep learning
    • about / What is deep learning?, Back to deep learning, What makes deep learning special?
    • myths / Some common myths about deep learning
    • versus traditional text classification / Comparing traditional text classification and deep learning
    • local computer, setting up for / Setting up a local computer for deep learning
    • AWS, using for / Using AWS for deep learning
    • Azure, using for / Using Azure for deep learning
    • Google Cloud, using for / Using Google Cloud for deep learning
    • Paperspace, using for / Using Paperspace for deep learning
    • reinforcement learning / Reinforcement learning
    • need for / What is deep learning and why do we need it?
    • applications / What are the applications of deep learning?
    • applying, in self-driving cars / How is deep learning applied in self-driving cars?
    • state-of-the-art solution / How does deep learning become a state-of-the-art solution?
  • deep learning AMI
    • creating, in AWS / Creating a deep learning AMI in AWS
  • deep learning frameworks, R
    • MXNet / MXNet
    • Keras / Keras
  • deep learning GPU instance
    • creating, in AWS / Creating a deep learning GPU instance in AWS
  • deep learning layers / Deep learning layers
  • deep learning libraries
    • data augmentation, using / Using data augmentation in deep learning libraries
  • deep learning model
    • building / Building a deep learning model
    • base model / Base model (no convolutional layers)
  • deep learning networks
    • visualizing, TensorBoard used / Using TensorBoard to visualize deep learning networks
  • deep learning NLP architectures
    • comparing / Comparing the deep learning NLP architectures
  • deep learning text classification / Deep learning text classification
  • deep neural network (DNN) / Deep neural networks
  • dense layer / Deep learning layers
  • dimensionality-reduction / Building neural network models
  • document classification / Document classification
  • downsampling layer / Extracting richer representation with CNNs
  • dropout
    • used, for improving out-of-sample model performance / Use case – improving out-of-sample model performance using dropout
    / Reducing overfitting with dropout

E

  • Emacs / Setting up your R environment
  • epoch.end.callback parameter / The epoch.end.callback parameter
  • eval.data parameter / The eval.metric and eval.data parameters
  • eval.metric parameter / The eval.metric and eval.data parameters
  • evaluation metrics
    • about / Evaluation metrics and evaluating performance
    • types / Types of evaluation metric
  • exploratory data analysis / Exploratory data analysis, Exploratory data analysis

F

  • fashion MNIST dataset
    • using, in classification / Classification using the fashion MNIST dataset
  • feature map / Extracting richer representation with CNNs
  • forward-propagation / Neural networks as an extension of linear regression
  • forward propagation / But what is a recurrent neural network, really?
  • fraud detection
    • with H2O / Fraud detection with H2O

G

  • Gated Recurrent Units (GRUs) / Gated Recurrent Units model
  • Gated recurrent units (GRUs) / GRU
  • gates / LSTM
  • generative adversarial networks (GANs) / Other deep learning topics
  • German Traffic Sign Recognition Benchmark (GTSRB)
    • convolutional neural networks (CNNs), used / Traffic sign recognition using CNN
    • exploring / Getting started with exploring GTSRB
    • convolutional neural networks (CNNs), using MXNet / First solution – convolutional neural networks using MXNet
    • convolutional neural networks (CNNs), using Keras with TensorFlow / Trying something new – CNNs using Keras with TensorFlow
    • overfitting, reducing with dropout / Reducing overfitting with dropout
  • global vectors (GloVe) / GloVe
  • Google Cloud
    • using, for deep learning / Using Google Cloud for deep learning
  • Google Neural Machine Translation system (GNMT) / What are the applications of deep learning?
  • GPUs / Do I need a GPU (and what is it, anyway)?
  • graphical processing units (GPUs) / The ImageNet dataset
  • GRU networks / LSTM and GRU networks, GRU

H

  • H2O
    • installing / Installing H2O
  • h2o
    • reference / Deep learning frameworks for R
  • handwritten digit recognition
    • CNNs, used / Handwritten digit recognition using CNNs
  • hidden layer
    • about / What is deep learning and why do we need it?, Multi-layer perceptron
    • adding, to networks / Adding more hidden layers to the networks
  • histogram of oriented gradients (HOG) / What makes deep learning special?, How does deep learning become a state-of-the-art solution?
  • hyperparameters, tuning
    • about / Tuning hyperparameters
    • grid search / Grid search
    • random search / Random search

I

  • image classification
    • with MXNet library / Image classification using the MXNet library
  • image classification models
    • about / Image classification models
    • existing models, loading / Loading an existing model
  • image classification solution
    • building / Building a complete image classification solution
    • image data, creating / Creating the image data
    • deep learning model, building / Building the deep learning model
    • saved deep learning model, using / Using the saved deep learning model
  • ImageNet dataset / The ImageNet dataset
  • ImageNet Large Scale Visual Recognition Challenge (ILSVRC) / The ImageNet dataset
  • Infrastructure as a Service (IaaS) / A brief introduction to AWS
  • initializer parameter / The initializer parameter
  • integrated development environment (IDE) / Setting up your R environment
  • International Conference on Machine Learning (ICML) / What are the applications of deep learning?
  • Internet of Things (IoT) / What are the applications of deep learning?

K

  • kappa metric / Sentiment extraction
  • Keras
    • about / Keras, Getting ready, The autoencoder approach – Keras
    • reference / Keras
    • URL / Trying something new – CNNs using Keras with TensorFlow
    • used, with TensorFlow / Trying something new – CNNs using Keras with TensorFlow
    • used, for Recurrent neural networks (RNNs) / Trying something new – CNNs using Keras with TensorFlow, RNN using Keras
    • installing, for R / Installing Keras and TensorFlow for R
  • Kullback-Leibler divergence / Variational Autoencoders

L

  • L1 penalty
    • about / Using regularization to overcome overfitting, L1 penalty
    • working / L1 penalty in action
  • L2 penalty
    • about / Using regularization to overcome overfitting, L2 penalty
    • working / L2 penalty in action
    • in neural networks / Weight decay (L2 penalty in neural networks)
  • Lasso / Using regularization to overcome overfitting
  • Latent Dirichlet Allocation (LDA) / What makes deep learning special?
  • latent variables / Variational Autoencoders
  • layer / What is deep learning?
  • learning rate / Neural networks as an extension of linear regression
  • least absolute shrinkage and selection operator / L1 penalty
  • LeNet
    • about / Image classification using the MXNet library
    • model, creating / LeNet
  • Light Detection and Ranging (LiDAR) / How is deep learning applied in self-driving cars?
  • LIME (Local Interpretable Model-Agnostic Explanations) / Some common myths about deep learning
  • linear regression
    • with TensorFlow / Linear regression using TensorFlow
  • local computer
    • setting up, for deep learning / Setting up a local computer for deep learning
  • Local Interpretable Model-Agnostic Explanations (LIME)
    • using, for interpretability / Use case—using LIME for interpretability
    • model interpretability / Model interpretability with LIME
  • logistic regression
    • about / First attempt – logistic regression
    • to single-layer neural networks / Going from logistic regression to single-layer neural networks
  • long short-term memory networks (LSTMs) / MXNet, Long short term memory model
  • LSTM / LSTM
  • LSTM architectures / Other LSTM architectures
  • LSTM networks / LSTM and GRU networks, LSTM

M

  • machine learning / What is deep learning?
  • Market Basket Analysis / Use case – collaborative filtering
  • Mean Absolute Error (MAE) / Types of evaluation metric
  • Mean Squared Error (MSE) / Types of evaluation metric
  • memory / A simple benchmark implementation
  • metrics, in Keras
    • reference / Types of evaluation metric
  • metrics, in MXNet
    • reference / Types of evaluation metric
  • MNIST
    • about / What are the applications of deep learning?, Autoencoders and MNIST
    • exploring / Get started with exploring MNIST
    • outlier detection / Outlier detection in MNIST, Outlier detection in MNIST
  • model interpretability
    • in LIME / Model interpretability with LIME
  • Momentum / Adding more hidden layers to the networks
  • MSE (mean-squared error) / Neural networks as an extension of linear regression
  • MXNet
    • about / MXNet, Introduction to the MXNet deep learning library
    • using, for classification / Use case – using MXNet for classification and regression
    • using, for regression / Use case – using MXNet for classification and regression
    • data download / Data download and exploration
    • data exploration / Data download and exploration
    • data, preparing for models / Preparing the data for our models
    • binary classification model / The binary classification model
    • regression model / The regression model
    • image classification / Image classification using the MXNet library
    • used, for convolutional neural networks (CNNs) / First solution – convolutional neural networks using MXNet

N

  • National Institute of Standards and Technology (NIST) / What are the applications of deep learning?
  • natural language processing (NLP) / What are the applications of deep learning?, Word embeddings
  • neural network code / Neural network code
  • neural network models
    • building / Building neural network models
  • neural networks
    • about / A conceptual overview of neural networks
    • as extension of linear regression / Neural networks as an extension of linear regression
    • example / Neural networks as an extension of linear regression
    • as network of memory cells / Neural networks as a network of memory cells
    • in R / Neural networks in R
    • predictions, generating from / Generating predictions from a neural network
    • building / Use case – building and applying a neural network
    • applying / Use case – building and applying a neural network
    • building, in R / Building neural networks from scratch in R
    / Vector embeddings and neural networks
  • neural network web application / Neural network web application
  • nnet package / Setting up your R environment
  • non-linear layer / Extracting richer representation with CNNs
  • num.round parameter / The num.round and begin.round parameters

O

  • optimizer parameter / The optimizer parameter
  • ordinary least squares (OLS) / L1 penalty
  • out-of-sample model performance
    • improving, dropout used / Use case – improving out-of-sample model performance using dropout
  • overfitting
    • overcoming, regularization used / Using regularization to overcome overfitting
    • reducing, with dropout / Reducing overfitting with dropout
  • overfitting data
    • issue / The problem of overfitting data – the consequences explained

P

  • Paperspace
    • using, for deep learning / Using Paperspace for deep learning
  • performance
    • evaluating / Evaluation metrics and evaluating performance, Evaluating performance
  • Platform as a Service (PaaS) / A brief introduction to AWS
  • polynomial regression / Neural networks as an extension of linear regression
  • pooling layer / Extracting richer representation with CNNs
  • portable pixmap (PPM) / Getting started with exploring GTSRB
  • precision-recall curve (AUC) / Credit card fraud detection with autoencoders
  • predictions
    • generating, from neural network / Generating predictions from a neural network
  • principal component analysis (PCA) / Building neural network models, What is unsupervised learning?, What makes deep learning special?

R

  • R
    • reference / Setting up your R environment
    • deep learning frameworks / Deep learning frameworks for R
    • neural networks, building / Building neural networks from scratch in R
  • R6 class
    • perceptron / Perceptron as an R6 class
    • logistic regression / Logistic regression
    • multi-layer perceptron / Multi-layer perceptron
  • receiver operating characteristic (ROC) / Credit card fraud detection with autoencoders
  • receptive field / Extracting richer representation with CNNs
  • rectified linear unit (ReLU) / Adding more hidden layers to the networks
  • recurrent neural network
    • about / What is so exciting about recurrent neural networks?, But what is a recurrent neural network, really?
    • LSTM networks / LSTM and GRU networks
    • GRU networks / LSTM and GRU networks
  • recurrent neural network (RNN) / Deep neural networks, Recurrent neural network model
  • Recurrent neural networks (RNNs)
    • about / What makes deep learning special?
    • from scratch in R / RNNs from scratch in R
    • implementing / Implementing a RNN
    • implementing, using R6 class / Implementation as an R6 class
    • implementing, without R6 class / Implementation without R6
    • without derivatives / RNN without derivatives — the cross-entropy method
    • Keras, used / RNN using Keras
    • benchmark implementation / A simple benchmark implementation
    • new text, generating from old text / Generating new text from old
  • regions of interest (ROI) / Getting started with exploring GTSRB
  • regression / What is deep learning?, Neural networks as an extension of linear regression
  • regularization
    • used, for overcoming overfitting / Using regularization to overcome overfitting
    • ensembles / Ensembles and model-averaging
    • model-averaging / Ensembles and model-averaging
  • reinforcement learning / Reinforcement learning
  • relu activation / Activation functions
  • Remote Desktop Protocol (RDP) / Using Azure for deep learning
  • R environment
    • setting up / Setting up your R environment
  • reproducible results
    • setting up / Setting up reproducible results
  • Reuters dataset / The Reuters dataset
  • RFM analysis / Use case – collaborative filtering
  • richer representation
    • extracting, with CNNs / Extracting richer representation with CNNs
  • Ridge / Using regularization to overcome overfitting
  • ridge regression / L2 penalty
  • R language
    • RNNs, from scratch / RNNs from scratch in R
    • classes, with R6 / Classes in R with R6
  • Root Mean Squared Error (RMSE) / Types of evaluation metric
  • R Shiny / Neural network web application
  • R Shiny web application
    • example / Setting up your R environment
  • RStudio
    • reference / Setting up your R environment
    • using / Setting up your R environment
  • RStudio IDE / Setting up your R environment

S

  • Scale Invariant Feature Transform (SIFT) / What makes deep learning special?, How does deep learning become a state-of-the-art solution?
  • sentiment analysis
    • about / Sentiment analysis from movie reviews
    • data preprocessing / Data preprocessing
    • words, to vectors / From words to vectors
    • sentiment extraction / Sentiment extraction
    • data cleansing / The importance of data cleansing
    • vector embeddings / Vector embeddings and neural networks
    • neural networks / Vector embeddings and neural networks
    • vector embedding / Vector embeddings and neural networks
    • Bi-directional LSTM networks / Bi-directional LSTM networks
    • LSTM architectures / Other LSTM architectures
    • mining, from Twitter / Mining sentiment from Twitter
    • Twitter API, connecting / Connecting to the Twitter API
    • model, building / Building our model
  • Software as a Service (SaaS) / A brief introduction to AWS
  • Speeded Up Robust Features (SURF) / What makes deep learning special?, How does deep learning become a state-of-the-art solution?
  • stacked bi-directional model / Stacked bidirectional model
  • standardization / Standardization
  • stochastic gradient descent (SGD) / Trying something new – CNNs using Keras with TensorFlow
  • Stuttgart Neural Network Simulator (SNNS) / Setting up your R environment, Building neural network models
  • supervised learning / What is deep learning?
  • Support Vector Machine (SVM) / What are the applications of deep learning?, How does deep learning become a state-of-the-art solution?
  • symbol parameter / The symbol, X, y, and ctx parameters

T

  • TensorBoard
    • used, for visualizing deep learning networks / Using TensorBoard to visualize deep learning networks
  • TensorFlow
    • about / Introduction to the TensorFlow library, Getting ready
    • models / TensorFlow models
    • linear regression / Linear regression using TensorFlow
    • Convolutional neural networks / Convolutional neural networks using TensorFlow
    • runs package / TensorFlow runs package
    • used, in CNNs with Keras / Trying something new – CNNs using Keras with TensorFlow
    • URL / Trying something new – CNNs using Keras with TensorFlow
    • installing, for R / Installing Keras and TensorFlow for R
  • TensorFlow estimators / TensorFlow estimators
  • TensorFlow models
    • deploying / Deploying TensorFlow models
  • tensors / Introduction to the MXNet deep learning library, Introduction to the TensorFlow library
  • test time augmentation (TTA) / Using data augmentation to increase the training data, Test time augmentation
  • text fraud detection
    • about / Text fraud detection
    • unstructured text data, to matrix / From unstructured text data to a matrix
    • text, to matrix representation / From text to matrix representation — the Enron dataset
    • autoencoder, on matrix representation / Autoencoder on the matrix representation
  • tf-idf (term frequency, inverse document frequency) / Comparing traditional text classification and deep learning
  • tidy text
    • working / Working with tidy text
  • tokens / Word vectors
  • traditional text classification
    • about / Traditional text classification
    • versus deep learning / Comparing traditional text classification and deep learning
  • trained model
    • using / Using a trained model
  • training data
    • increasing, data augmentation used / Using data augmentation to increase the training data
  • training run / TensorFlow runs package
  • transfer learning / Transfer learning
  • Twitter API
    • connecting / Connecting to the Twitter API

U

  • undercomplete auto-encoder / How do auto-encoders work?
  • unigrams / Data preprocessing
  • unstructured text data
    • to matrix representation / From unstructured text data to a matrix
  • unsupervised learning / What is deep learning?, What is unsupervised learning?

V

  • Variational Autoencoders (VAE)
    • about / Variational Autoencoders
    • used, for image reconstruction / Image reconstruction using VAEs
    • outlier detection, in MNIST / Outlier detection in MNIST
  • vector embedding / Vector embeddings and neural networks
  • vector embeddings / Vector embeddings and neural networks
  • vectors
    • to words / From words to vectors

W

  • weight decay / Weight decay (L2 penalty in neural networks)
  • word2vec / word2vec
  • word embeddings
    • about / Word embeddings
    • word2vec / word2vec
    • GloVe / GloVe
  • words
    • to vectors / From words to vectors
  • word vectors / Word vectors