I have a Databricks workspace where I have mounted my Azure storage containers, but after enabling Unity Catalog I am unable to list the mount points using dbutils.fs.ls('/mnt/') unless I have admin access.
I have multiple scripts automated through ADF, so I cannot change the mount points to Unity Catalog volumes right now.
I tried the code below to list the mount points, but it throws the following exception:
dbutils.fs.ls('/mnt/')
ExecutionError: An error occurred while calling o438.ls.
org.apache.spark.SparkSecurityException: [INSUFFICIENT_PERMISSIONS] Insufficient privileges:
User does not have permission SELECT on any file. SQLSTATE: 42501
at com.databricks.sql.acl.Unauthorized.throwInsufficientPermission
The above error message indicates that your user does not have sufficient permissions.
To make sure your user account has the required roles assigned, you can follow the steps below.
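Separately, since the error names the ANY FILE privilege specifically, a workspace admin can grant it directly. This is a sketch; the principal below is a placeholder, so replace it with your user or group:

```sql
-- Run by a workspace admin on a Unity Catalog-enabled cluster.
-- ANY FILE allows reading paths that sit outside Unity Catalog
-- governance, such as DBFS mount points.
GRANT SELECT ON ANY FILE TO `user@example.com`;
```

Note that ANY FILE is a broad privilege; granting it to a narrow group rather than individual users is usually easier to audit.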
Below is my mount script:
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<Your client ID>",
    "fs.azure.account.oauth2.client.secret": dbutils.secrets.get(scope="<Your Databricks secret scope name>", key="<Key Vault secret name>"),
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<Your tenant ID>/oauth2/token",
}

dbutils.fs.mount(
    source="abfss://[email protected]",
    mount_point="/mnt/raw",
    extra_configs=configs,
)
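Unrelated to the permission grant, one way to catch mount-script typos early is to check the config dict for missing keys before calling dbutils.fs.mount(). This is a hypothetical helper, not part of the Databricks API; the key names mirror the Hadoop ABFS OAuth settings used above:

```python
# Required ABFS OAuth settings, matching the keys in the mount script above.
REQUIRED_KEYS = {
    "fs.azure.account.auth.type",
    "fs.azure.account.oauth.provider.type",
    "fs.azure.account.oauth2.client.id",
    "fs.azure.account.oauth2.client.secret",
    "fs.azure.account.oauth2.client.endpoint",
}

def missing_oauth_keys(configs: dict) -> set:
    """Return the required ABFS OAuth keys absent from `configs`."""
    return REQUIRED_KEYS - configs.keys()

# Example with a deliberately incomplete dict:
partial = {"fs.azure.account.auth.type": "OAuth"}
print(sorted(missing_oauth_keys(partial)))
```

Calling this before the mount turns a cryptic mount failure into an explicit list of forgotten settings.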
I have assigned the Key Vault Administrator role to the SPN and the AzureDatabricks app.
I have assigned the Storage Blob Data Contributor role to the SPN and the AzureDatabricks app.
Results:
dbutils.fs.ls('/mnt/raw/')
[FileInfo(path='dbfs:/mnt/raw/sample_employee_data.csv', name='sample_employee_data.csv', size=813, modificationTime=1729745074000)]