I’m attempting to run a notebook on Azure Databricks using the Databricks REST API. A Python script triggers the notebook at a given workspace path, with authentication handled by a Service Principal (SPN).
I’ve granted the SPN access to the relevant repositories and the cluster so that it has the permissions needed to run the notebook. However, when I try to execute the notebook, I encounter the following error:
{
  "error_code": "PERMISSION_DENIED",
  "message": "User 'my-spn' does not have Manage Run or Owner or Admin permissions on job 246372968680205"
}
APIs Used:
Job Creation API:
URL: https://myurl.azuredatabricks.net/api/2.1/jobs/create
Payload:
{
  "format": "MULTI_TASK",
  "max_concurrent_runs": 1,
  "name": "TEST-1",
  "tasks": [
    {
      "notebook_task": {
        "notebook_path": "/Workspace/Repos/test"
      },
      "task_key": "Test_1",
      "existing_cluster_id": "vattkymf"
    }
  ],
  "webhook_notifications": {}
}
This API creates a job and returns a job ID.
{
  "job_id": 38022546101247
}
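For context, my script builds the job-creation call roughly as follows. The helper name and the use of `urllib` are just for illustration; the endpoint and payload are exactly the ones shown above.

```python
import json
from urllib import request

# Workspace URL from the question (sensitive parts removed).
HOST = "https://myurl.azuredatabricks.net"

def build_create_job_request(token: str) -> request.Request:
    """Build (but do not send) the POST to /api/2.1/jobs/create."""
    payload = {
        "format": "MULTI_TASK",
        "max_concurrent_runs": 1,
        "name": "TEST-1",
        "tasks": [
            {
                "notebook_task": {"notebook_path": "/Workspace/Repos/test"},
                "task_key": "Test_1",
                "existing_cluster_id": "vattkymf",
            }
        ],
        "webhook_notifications": {},
    }
    return request.Request(
        f"{HOST}/api/2.1/jobs/create",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending this request with `request.urlopen` returns the `job_id` shown above.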
Job Execution API:
URL: https://myurl.azuredatabricks.net/api/2.1/jobs/run-now
Payload:
{
  "job_id": 38022546101247
}
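The run-now call is built the same way (again, the helper name and `urllib` usage are just illustrative):

```python
import json
from urllib import request

# Workspace URL from the question (sensitive parts removed).
HOST = "https://myurl.azuredatabricks.net"

def build_run_now_request(job_id: int, token: str) -> request.Request:
    """Build (but do not send) the POST to /api/2.1/jobs/run-now."""
    return request.Request(
        f"{HOST}/api/2.1/jobs/run-now",
        data=json.dumps({"job_id": job_id}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

It is this second call that fails with the PERMISSION_DENIED error above.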
All APIs are using a Bearer token created with the SPN.
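For completeness, the Bearer token is obtained via the standard Azure AD client-credentials flow for the SPN. The tenant/client values below are placeholders; the resource GUID is the fixed Azure AD application ID for Azure Databricks, which is the same for every workspace.

```python
from urllib import parse, request

# Hypothetical placeholders -- real tenant/client values are removed.
TENANT_ID = "my-tenant-id"
CLIENT_ID = "my-spn-client-id"
CLIENT_SECRET = "my-spn-secret"
# Fixed Azure AD resource ID for Azure Databricks.
DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"

def build_token_request() -> request.Request:
    """Build (but do not send) the AAD client-credentials token request."""
    form = parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": f"{DATABRICKS_RESOURCE_ID}/.default",
    }).encode()
    return request.Request(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
        data=form,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
        method="POST",
    )
```

The `access_token` from the JSON response is what I pass as the Bearer token to both Jobs API calls.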
Issue:
Even though the SPN has been granted access to the cluster, the repositories, and (as far as I can tell) the specific job, and the notebook path is correct, the error indicates that the SPN lacks sufficient permissions on the job itself. I have double-checked all permissions and paths, but the issue persists.
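One thing I considered is inspecting or updating the job's own ACL via the Databricks Permissions API, since job permissions are separate from cluster and repo permissions. The sketch below builds a PATCH that would grant the SPN run permissions on the job; `CAN_MANAGE_RUN` is my assumption of the level the error is asking for, and the SPN application ID is a placeholder.

```python
import json
from urllib import request

# Workspace URL from the question (sensitive parts removed).
HOST = "https://myurl.azuredatabricks.net"

def build_grant_request(job_id: int, spn_app_id: str, token: str) -> request.Request:
    """Build (but do not send) a PATCH to the Permissions API that would
    grant the service principal CAN_MANAGE_RUN on the given job."""
    payload = {
        "access_control_list": [
            {
                "service_principal_name": spn_app_id,  # the SPN's application ID (placeholder here)
                "permission_level": "CAN_MANAGE_RUN",  # assumed to be the level the error requires
            }
        ]
    }
    return request.Request(
        f"{HOST}/api/2.0/permissions/jobs/{job_id}",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )
```

A GET against the same `/api/2.0/permissions/jobs/{job_id}` endpoint would show the job's current ACL, which might reveal why the SPN is being rejected.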
Note: I have removed some sensitive data.