I have tried editing the job settings by calling one of the Jobs APIs and updating them by job ID. Is it possible to replace the job's current cluster with a serverless cluster using Python?
https://{databricks_host()}/api/2.1/jobs/get – I used this endpoint to fetch the current job settings.
https://{databricks_host()}/api/2.1/jobs/update – to update the job settings.
Config that I used:
new_cluster['spark_conf'] = new_cluster.get('spark_conf', {})
new_cluster['spark_conf']['spark.databricks.cluster.profile'] = 'serverless'
new_cluster['spark_conf']['spark.databricks.repl.allowedLanguages'] = 'sql,python,r'