Deploying your models for batch scoring in Kubernetes
We will use Kubernetes to run our batch scoring job. To do this, we need to make some modifications so that the project is packaged in a Docker image that MLflow can submit to Kubernetes for execution in production. The prerequisite for this section is that you have access to a Kubernetes cluster, or can set up a local one; quick-start guides are available at https://kind.sigs.k8s.io/docs/user/quick-start/ and https://minikube.sigs.k8s.io/docs/start/.
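If you do not already have a cluster available, a local one can be created with kind. The following is a minimal sketch; the cluster name `mlflow-batch` is an arbitrary choice for this example, and it assumes both `kind` and `kubectl` are already installed:

```shell
# Create a local Kubernetes cluster with kind
# (the name "mlflow-batch" is illustrative).
kind create cluster --name mlflow-batch

# kind registers a kubectl context named "kind-<cluster name>";
# verify that the cluster is up and responding.
kubectl cluster-info --context kind-mlflow-batch
```

Deleting the cluster afterwards with `kind delete cluster --name mlflow-batch` cleans up the local environment.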
You will now execute the following steps to deploy your model from the registry in Kubernetes:
- Prerequisite: Install and configure `kubectl` (https://kubernetes.io/docs/reference/kubectl/overview/) and point it at your Kubernetes cluster.
- Create a Kubernetes backend configuration file:
```json
{
    "kube-context": "docker-for-desktop",
    "repository-uri": "username/mlflow-kubernetes-example",
    "kube-job-template-path"...
```
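To illustrate the overall shape of this step, a complete backend configuration and the job template it references might look as follows. This is a sketch patterned on the MLflow Projects documentation; the context name, repository URI, and file names are assumptions for the example, not values from the text:

```json
{
    "kube-context": "docker-for-desktop",
    "repository-uri": "username/mlflow-kubernetes-example",
    "kube-job-template-path": "kubernetes_job_template.yaml"
}
```

The referenced job template is a standard Kubernetes Job spec in which MLflow substitutes the quoted placeholders at run time:

```yaml
apiVersion: batch/v1
kind: Job
metadata:
  name: "{replaced with MLflow Project name}"
  namespace: mlflow
spec:
  ttlSecondsAfterFinished: 100
  backoffLimit: 0
  template:
    spec:
      containers:
        - name: "{replaced with MLflow Project name}"
          image: "{replaced with URI of Docker image created during Project execution}"
          command: ["{replaced with MLflow Project entry point command}"]
      restartPolicy: Never
```

With both files in place, the project can be submitted to the cluster with `mlflow run`, passing `--backend kubernetes` and `--backend-config kubernetes_config.json`.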