
Practical machine learning with Google Prediction API and Salesforce

To understand machine learning concepts in practice, we will build a simple Proof of Concept (PoC) demo that uses the Google Prediction API and applies the predictive results to Salesforce records. The aim of this exercise is to help you understand the basic steps of machine learning without digging into the finer details, and to give you an idea of how external machine learning algorithms can be applied to Salesforce data and the value they add to it.

Google offers a simple Prediction API that provides predefined, general-purpose model types. Its algorithms can be categorized as follows:

  • Regression model: given a new item, predict a numeric value for it, based on similar valued examples in its training data
  • Categorical model: given a new item, choose the category that describes it best, given a set of similarly categorized items in its training data
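
As a quick illustration (the rows below are hypothetical and not taken from the book's dataset), the first column of a training row holds the value to be predicted. A regression model is trained on numeric targets:

       72.5,Existing Customer,50000
       35.0,New Customer,12000

whereas a categorical model is trained on labelled targets:

       Closed Won,Existing Customer,50000
       Closed Lost,New Customer,12000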

The integration of the Prediction API with Salesforce is covered in the rest of this section.

Business scenario

Universal Containers (a fictitious company) wants to find the probability of opportunity closure using its existing data. We will help them integrate the Google Prediction API with Salesforce to predict the probability of closure of a Salesforce opportunity based on the expected revenue and the opportunity type, which in turn can help them forecast revenue better.

We will take an opportunity dataset from Salesforce, upload it into Google Cloud Storage, build a regression-type predictive model, train it, and use it to predict the probability of opportunity closure.

Prerequisites

The following prerequisites are required to experiment with the Google Prediction API and Salesforce:

  1. Make sure you have Salesforce login credentials. If you do not have any, sign up at https://developer.salesforce.com/signup.
  2. Sign up for a Google account at https://accounts.google.com/SignUp?hl=en.
  3. Enable the Prediction API by visiting https://console.cloud.google.com/home/dashboard.
  4. Create a Google Cloud project, as shown in the following screenshot. Once you create the project, note the Project ID, as we will be using it in the API requests:

The project creation screen in the Google console

  5. Sign up for free Google Cloud Storage at https://console.cloud.google.com/storage/browser.
  6. Create a folder called salesforceeinstein and upload the provided CSV (shared in the git repository at https://github.com/PacktPublishing/Learning-Salesforce-Einstein/blob/master/Chapter1/SFOpportunity.csv) to Google Cloud Storage. Name the file SFOpportunity.csv:

  7. Open the Prediction API explorer (https://developers.google.com/apis-explorer/#s/prediction/v1.6/) to train the model via the API. We will first need to enable OAuth for the project and use the right scope. The following screenshot shows the OAuth 2.0 screen and the scope enablement screen; you will need to select the auth/prediction checkbox:

  8. We will be using version v1.6 of the Prediction API. Training and prediction are covered in the next section.

Note that the CSV data here is a report extract of opportunity data from Salesforce. You can extract the data using the standard Salesforce reporting interface. You will also need to create a custom probability field on the Salesforce opportunity object to hold the probability returned by the Prediction API.

Check the following screenshot of the dataset sample. The data samples can be taken from your own Salesforce organization; if you want to use the one used in this book, you can get it from the git repository (https://github.com/PacktPublishing/Learning-Salesforce-Einstein/blob/master/Chapter1/SFOpportunity.csv):

Training and prediction

A regression-type predictive model returns a numeric answer to a question, based on previous samples; that is the model type we will use for this example. Take a look at the file located at https://github.com/PacktPublishing/Learning-Salesforce-Einstein/blob/master/Chapter1/SFOpportunity.csv. The first column is the probability, the second column is the opportunity type, and the last column is the opportunity revenue. If you look carefully, you will notice that there is some correlation between the opportunity type, the expected revenue, and the probability.

If you observe the dataset sample closely, you will see that for opportunities of type Existing Customer, the higher the expected revenue, the higher the probability.
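
For instance, rows shaped like the following (illustrative values only, not the exact contents of SFOpportunity.csv) exhibit the pattern the regression model can learn, with the probability in the first column rising with the amount for the Existing Customer type:

       25,Existing Customer,10000
       55,Existing Customer,60000
       85,Existing Customer,150000
       20,New Customer,60000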

To train the dataset, we will leverage the Prediction API provided by Google. The complete set of API methods is listed in the following table:

API                                  Description
prediction.hostedmodels.predict      This submits an input and requests an output against a hosted model.
prediction.trainedmodels.analyze     This gets an analysis of the model and the data the model was trained on.
prediction.trainedmodels.delete      This deletes a trained model.
prediction.trainedmodels.get         This checks the training status of your model.
prediction.trainedmodels.insert      This trains a Prediction API model.
prediction.trainedmodels.list        This lists available models.
prediction.trainedmodels.predict     This submits a model ID and requests a prediction.
prediction.trainedmodels.update      This adds new data to a trained model.

If you recall, machine learning consists of three steps:

  1. Load the sample dataset.
  2. Train the data.
  3. Use the generated function to predict the outcome for new data.

Integration architecture

The following diagram shows the integration architecture that we have adopted for experimenting with the Google Prediction API and Salesforce data:

Integration Architecture For Using Google Prediction API with Salesforce

The data is loaded into a Google Cloud Storage bucket and trained manually via the Prediction API from the console, which produces a model. We then submit queries to this model by triggering an HTTP request to the Google Prediction API from Salesforce. The architecture is purposefully kept very simple to help you grasp the fundamentals before we approach the Prediction API in more detail.

If you have multiple Google accounts, it is recommended that you use your browser's incognito mode for this experiment.

The following are the steps required to train the dataset via the Google Prediction API. Note that we will use prediction.trainedmodels.insert to train the model:

  1. Load the sample dataset: At this point, the assumption is that you have extracted data out of the Salesforce opportunity object and loaded it into Google Cloud Storage. The steps are covered in the Prerequisites section.
  2. Train the dataset: Let's use the API explorer to train the sample data via the API. Google's API explorer is located at https://developers.google.com/apis-explorer/#s/prediction/v1.6/.

The following screenshot shows how one can train the model:

The request and response JSON are shown as follows. Note that learned-maker-155103 is the Project ID; replace it with your own Project ID.

Request Payload:

        POST https://www.googleapis.com/prediction/v1.6/projects/learned-maker-155103/trainedmodels
        {
          "modelType": "regression",
          "id": "opportunity-predictor",
          "storageDataLocation": "salesforceeinstein/SFOpportunity.csv"
        }
In the preceding request JSON, learned-maker-155103 is the Project ID of the Google project. The id value can be anything you choose; it is used to track the status of the model training.

Carefully note that storageDataLocation points to the location in Cloud Storage where our data resides. Once the request is successful, we get a response from the API, which is shown as follows:

Response Payload:

        200
        {
          "kind": "prediction#training",
          "id": "opportunity-predictor",
          "selfLink": "https://www.googleapis.com/prediction/v1.6/projects/learned-maker-155103/trainedmodels/opportunity-predictor",
          "storageDataLocation": "salesforceeinstein/SFOpportunity.csv"
        }

We can monitor the status of the training via the prediction.trainedmodels.get API.

The request to execute in the console is as follows:

Request Payload:

        GET https://www.googleapis.com/prediction/v1.6/projects/learned-maker-155103/trainedmodels/opportunity-predictor

Response Payload:

        200
        {
          "kind": "prediction#training",
          "id": "opportunity-predictor",
          "selfLink": "https://www.googleapis.com/prediction/v1.6/projects/learned-maker-155103/trainedmodels/opportunity-predictor",
          "created": "2017-01-18T19:10:27.752Z",
          "trainingComplete": "2017-01-18T19:10:48.584Z",
          "modelInfo": {
            "numberInstances": "18",
            "modelType": "regression",
            "meanSquaredError": "79.61"
          },
          "trainingStatus": "DONE"
        }

Once training completes, the trainingStatus field in the response shows the following status:

      DONE

Note that if your data does not have any correlation, you will see a very high meanSquaredError value.

  3. Use the trained model to predict the outcome: For this, we will create a simple trigger on the opportunity object in Salesforce that makes an asynchronous API call to the Google Prediction API to predict the probability of opportunity closure.

Before we add the trigger to Salesforce, make sure that the Predicted_Probability__c field has been created in Salesforce. To create the field, follow these steps:

  1. Navigate to SETUP | Object Manager | Opportunity in Lightning Experience, or Setup | Opportunities | Fields | New in Classic.
  2. Select the field type as Number (with a length of 5 and 2 decimal places), accept the defaults, and save:

The trigger code uses Apex, a Java-like language provided by the Salesforce platform for writing business logic. For the purpose of demonstration, we will keep the code very simple:

      //Trigger makes an API call to the Google Prediction API
      //to predict opportunity probability.
      //Please note that this trigger is written for demonstration
      //purposes only and is not bulkified or batched.
      trigger opportunityPredictor on Opportunity (after insert) {
          if (Trigger.isInsert && Trigger.isAfter) {
              OpportunityTriggerHelper.predictProbability(Trigger.new[0].Id);
          }
      }

If you are using Salesforce Classic, the navigation path to add the trigger on an opportunity is SETUP | Customize | Opportunities | Triggers.

For Lightning Experience, open the Developer Console from the Setup menu and use it to create the trigger.

Also note that because the trigger depends on Apex classes, save the dependent Apex classes first, before saving the trigger.

The Apex class that is invoked from the trigger is as follows (in Classic, Apex classes are created under SETUP | Develop | Apex Classes):

      //Apex class to make a callout to the Google Prediction API
      public with sharing class OpportunityTriggerHelper {

          @future(callout=true)
          public static void predictProbability(Id opportunityId) {
              Opportunity oppData = [SELECT Id, Amount, Type, Predicted_Probability__c
                                     FROM Opportunity
                                     WHERE Id = :opportunityId];

              //The Google_Auth named credential supplies the endpoint and handles authentication
              HttpRequest req = new HttpRequest();
              req.setEndpoint('callout:Google_Auth');
              req.setMethod('POST');
              req.setHeader('content-type', 'application/json');

              //Form the body: the csvInstance holds the opportunity type and amount
              PredictionAPIInput apiInput = new PredictionAPIInput();
              PredictionAPIInput.csvData csvData = new PredictionAPIInput.csvData();
              csvData.csvInstance = new List<String>{
                  oppData.Type, String.valueOf(oppData.Amount)};
              apiInput.input = csvData;

              Http http = new Http();
              req.setBody(JSON.serialize(apiInput));
              HttpResponse res = http.send(req);
              System.debug(res.getBody());

              //Store the predicted probability returned by the API on the opportunity
              if (res.getStatusCode() == 200) {
                  Map<String, Object> result = (Map<String, Object>)
                      JSON.deserializeUntyped(res.getBody());
                  oppData.Predicted_Probability__c =
                      Decimal.valueOf((String) result.get('outputValue'));
                  update oppData;
              }
          }
      }

The Apex class used to structure the request JSON is as follows:

       public class PredictionAPIInput {
           public csvData input;

           public class csvData {
               public String[] csvInstance;
           }
       }
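
With this wrapper in place, JSON.serialize(apiInput) in the helper class produces a request body shaped like the following (the values are illustrative):

       {
         "input": {
           "csvInstance": ["Existing Customer", "50000"]
         }
       }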

Salesforce Apex offers HTTP classes that let us call an external API. We leverage these, together with the configuration in the named credential, to make the HTTP request to the Google Prediction API.
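
As the trigger comment notes, the code above handles only the first record and is not bulkified. Purely as a sketch (this is not the book's code, and the class and method names are hypothetical), a bulk-safe variant could pass all the inserted record Ids to the future method and perform a single update at the end:

      //Hypothetical bulk-safe helper (sketch only). It assumes the same
      //Google_Auth named credential and PredictionAPIInput wrapper class
      //shown earlier. The trigger would call
      //OpportunityTriggerHelperBulk.predictProbabilities(Trigger.newMap.keySet());
      public with sharing class OpportunityTriggerHelperBulk {

          @future(callout=true)
          public static void predictProbabilities(Set<Id> opportunityIds) {
              List<Opportunity> toUpdate = new List<Opportunity>();
              //One callout per record: Salesforce allows at most 100 callouts
              //per transaction, so this is still not suitable for large batches
              for (Opportunity oppData : [SELECT Id, Amount, Type, Predicted_Probability__c
                                          FROM Opportunity
                                          WHERE Id IN :opportunityIds]) {
                  PredictionAPIInput apiInput = new PredictionAPIInput();
                  PredictionAPIInput.csvData csvData = new PredictionAPIInput.csvData();
                  csvData.csvInstance = new List<String>{
                      oppData.Type, String.valueOf(oppData.Amount)};
                  apiInput.input = csvData;

                  HttpRequest req = new HttpRequest();
                  req.setEndpoint('callout:Google_Auth');
                  req.setMethod('POST');
                  req.setHeader('content-type', 'application/json');
                  req.setBody(JSON.serialize(apiInput));

                  HttpResponse res = new Http().send(req);
                  if (res.getStatusCode() == 200) {
                      Map<String, Object> result = (Map<String, Object>)
                          JSON.deserializeUntyped(res.getBody());
                      oppData.Predicted_Probability__c =
                          Decimal.valueOf((String) result.get('outputValue'));
                      toUpdate.add(oppData);
                  }
              }
              //Single DML statement for all processed records
              update toUpdate;
          }
      }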

Setting up authentication for calling the API from SFDC

To simplify authentication, we will use a named credential in Salesforce together with auth settings to keep things configurable. Note that this is not scalable; it is a quick way to integrate Salesforce with Google for demonstration purposes only. You can use a service account or set up full OAuth 2.0 if you need a scalable approach.

The steps to authorize Salesforce to access the Google Prediction API are as follows:

  1. Generate a client ID and client secret via the Google Auth screen. To generate these, navigate to the console URL (https://console.developers.google.com/apis/credentials), choose the OAuth consent screen subtab, fill in the Product name shown to users as Salesforce, and save the OAuth consent screen form.

    The following screenshot shows the OAuth consent screen and details one has to fill in to generate the Consumer Secret and Consumer Key:

Once the OAuth consent screen is created, create a web application and note the Consumer Secret and Consumer Key. These will be used in the Salesforce Auth. Provider screen:

  2. Use an Auth. Provider in Salesforce to set up authorization, as shown in the following screenshot. The path for it is SETUP | Security Controls | Auth. Providers | New.

Select the Provider Type as Google. The following screenshot shows the form. Note that we use the Consumer Key and Consumer Secret from step 1 as input to the Auth. Provider form:

The following screenshot shows the saved Auth. Provider record. Note the Callback URL as it will be fed back to the Google OAuth consent screen:

  3. Note down the Callback URL from the Auth. Provider screen; it needs to be added back to the Google Auth screen. The following screenshot shows the redirect URL configuration in the Google OAuth screen:
  4. Use a Named Credential so that we avoid passing the authorization token manually. Carefully note the scope defined in the setup (a configuration sketch follows this list). The path for it in Classic is Setup | Security Controls | Named Credentials:
  5. Test the results. This final step uses our model with real-time record creation in Salesforce. Create an opportunity record and specify the opportunity type and amount. Once the record is inserted, the trigger fires, calls the Prediction API with the new data, and stores the result back in the custom probability field.
Note that if you get an authentication error in the logs, go to Named Credentials and authorize the app again.
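
The Named Credential settings below are a sketch of one possible configuration, not a screenshot from the book: the label Google_Auth matches what the Apex callout references, the URL is the predict endpoint shown next, and the scope is the one enabled in the prerequisites; your values may differ:

       Label                    : Google_Auth
       URL                      : https://www.googleapis.com/prediction/v1.6/projects/learned-maker-155103/trainedmodels/opportunity-predictor/predict
       Identity Type            : Named Principal
       Authentication Protocol  : OAuth 2.0
       Authentication Provider  : the Google Auth. Provider created in step 2
       Scope                    : https://www.googleapis.com/auth/prediction
       Start Authentication Flow on Save : checked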

The API that the Apex trigger code hits is as follows:

https://www.googleapis.com/prediction/v1.6/projects/learned-maker-155103/trainedmodels/opportunity-predictor/predict
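
A successful predict call to this endpoint (with the request body shown earlier) returns the predicted value in the outputValue field, which is what the Apex helper reads. The response below is illustrative rather than an actual result:

       200
       {
         "kind": "prediction#output",
         "id": "opportunity-predictor",
         "outputValue": "72.53"
       }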

To test, create a new Salesforce opportunity record, fill in fields such as the type and the amount, and monitor the custom probability field on the opportunity record.

Note that we are using a custom field for the probability. Also note that we have switched our Salesforce organization to the new Lightning Experience. Salesforce offers Classic and Lightning Experience: Classic is the older user interface, while Lightning Experience is the modern one. To enable Lightning Experience in your developer org, use the Switch to Lightning Experience item under the Setup menu. You can always toggle between Lightning and Classic in your developer org.

To switch to Salesforce Lightning Experience, consider the following screenshot:

The following screenshot shows the results obtained in Lightning Experience, and, clearly, the Predicted_Probability field is populated:

Drawback of this approach

While the preceding approach serves as a great experiment for understanding the fundamentals of a machine learning process, it is not scalable.

From the previous experiment, we can draw the following conclusions:

  • The prediction system uses a large dataset, and hence, considering the data limits on the Salesforce platform, it is better to have a big data server collect the data and form the model
  • The data needs to be trained periodically to obtain an appropriate model and keep it up to date
  • Machine learning uses statistical analysis under the hood. In this scenario, we are using a regression model. As an app developer, there is no need to dig deep into the mathematics, although there is no harm in doing so
Google is planning to shut down its Prediction API service in June 2018; we have used it for experimental purposes only. Please check the article at https://cloud.google.com/prediction/docs/end-of-life-faq to learn more. Going forward, Google has introduced Cloud Machine Learning services instead. If you are planning to build machine learning tasks with Google APIs, prefer the machine learning services documented at https://cloud.google.com/products/machine-learning/.