Terraform
Databricks supports defining workflows in Terraform, and it is a practical way to deploy and change your workflows in a versioned, repeatable manner.
Here is how you can define a workflow, also called a job in some interfaces. You must set the resource name and the workflow name, and then define the tasks within the workflow:
resource "databricks_job" "my_pipeline_1" {
  name = "my_awesome_pipeline"

  task {
    ....
    existing_cluster_id = <cluster-id>
  }

  task {
    ....
    existing_cluster_id = <cluster-id>
  }
}
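To make the skeleton above more concrete, here is a sketch of what a filled-in job might look like. The task keys, notebook paths, and cluster ID are hypothetical placeholders, not values from any real workspace; the sketch assumes each task runs a notebook on an existing cluster, with the second task depending on the first:

```hcl
# Hypothetical example: two dependent notebook tasks in one job.
# Task keys, notebook paths, and the cluster ID are placeholders.
resource "databricks_job" "my_pipeline_2" {
  name = "my_second_pipeline"

  task {
    task_key            = "ingest"
    existing_cluster_id = "0000-000000-abcdefgh"

    notebook_task {
      notebook_path = "/Pipelines/ingest"
    }
  }

  task {
    task_key            = "transform"
    existing_cluster_id = "0000-000000-abcdefgh"

    # Run this task only after "ingest" succeeds.
    depends_on {
      task_key = "ingest"
    }

    notebook_task {
      notebook_path = "/Pipelines/transform"
    }
  }
}
```

Because each task has a unique task_key, the depends_on block can express the execution order between tasks, which is what turns a flat list of tasks into a pipeline.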
Failed runs
When a workflow fails, you have the option to repair the run. You don't need to rerun the whole pipeline; Workflows is smart enough to rerun only the failed steps. This brings up the important topic of creating idempotent steps in a workflow. In short, if you...