Referring to this documentation: here.
It mentions that variable substitution can be done for `databricks bundle validate` OR `databricks bundle deploy` OR `databricks bundle run`. The important word here is: OR.
I have a job with named parameters whose values come from bundle variables that have default values. Now I would like to trigger a run of that job with a different variable value, without changing the value for all runs.
This is part of my task definition:
```yaml
tasks:
  - task_key: ci_start
    job_cluster_key: job_cluster
    libraries:
      - whl: ../../dist/*.whl
    python_wheel_task:
      package_name: ci_workflows
      entry_point: ci_start
      named_parameters:
        postfix: ${var.postfix}
```
So, I am trying to substitute the value of `postfix` for just one particular run.
My variable declaration looks like this:
```yaml
variables:
  postfix:
    description: Suffix used for integration tests during CI.
    default: "init"
```
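(As an aside: since the documentation says `validate` also accepts `--var`, one way to check that the substitution itself resolves is to inspect the validate output. This is a sketch; the `-o json` output flag is an assumption based on my CLI version.)

```shell
# Render the bundle with the override and inspect the resolved value of postfix
# (flags assumed from the same documentation page; adjust to your CLI version)
databricks bundle validate -o json --var="postfix=abc"
```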
So, if I run

```shell
databricks bundle deploy
databricks bundle run --var="postfix=abc" my_job
```

I still see the triggered run with the variable `postfix` set to `init`, which is the default value from the variable declaration, and not what was passed to the `databricks bundle run` command.
If, however, I substitute the variable value in `databricks bundle deploy` and then trigger `databricks bundle run` (even without specifying `--var`), the run does get the new value, i.e. after

```shell
databricks bundle deploy --var="postfix=abc"
databricks bundle run my_job
```

the value of `postfix` becomes `abc`.
But the problem is that this changes the variable value in the deployed job definition itself, not just for one particular run. I would like to retain the original default value of `postfix` (`init`) for all other runs (where no value is supplied), and override it only for a single run.
Is it not possible to trigger a run with a different variable value without changing the job definition? The documentation says **OR**, so it should be possible.
What am I missing?