The problem
After setting up Unity Catalog and a managed volume, I can upload files to and download files from the volume in the Databricks workspace UI.
However, I cannot access the volume from a notebook. I created an All-purpose compute and ran dbutils.fs.ls("/Volumes/catalog1/schema1/volumn11"). Then I got the error
Operation failed: "This request is not authorized to perform this operation.", 403, GET
How we set up Unity Catalog and the managed Volume
- I am the Azure Databricks Account Admin, Metastore Admin, and Workspace Admin
- I created an Azure Databricks Workspace (Premium Tier)
- I created a Databricks Metastore, named metastore1
- I created an Azure ADLS Gen2 storage account (with hierarchical namespace enabled), named adsl_gen2_1
- I created an Azure Access Connector for Azure Databricks (as an Azure Managed Identity), named access_connector_for_dbr_1
- On adsl_gen2_1, I assigned the roles Storage Blob Data Contributor and Storage Queue Data Contributor to access_connector_for_dbr_1
- I created two ADLS Gen2 containers under adsl_gen2_1
  - One named adsl_gen2_1_container_catalog_default
  - Another one named adsl_gen2_1_container_schema1
- I created a Databricks Storage Credential, named dbr_strg_cred_1
  - Its connector id is the resource id of access_connector_for_dbr_1
  - The Permissions of this Storage Credential were not set (empty)
- I created two Databricks External Locations, both using dbr_strg_cred_1
  - One named dbr_ext_loc_catalog_default, which points to the ADLS Gen2 container adsl_gen2_1_container_catalog_default; the Permissions of this External Location were not set (empty)
  - Another one named dbr_ext_loc_schema1, which points to the ADLS Gen2 container adsl_gen2_1_container_schema1; the Permissions of this External Location were not set (empty)
- I created a Databricks Catalog, named catalog1, under metastore1, and set dbr_ext_loc_catalog_default as this catalog's Storage Location
- I created a Databricks Schema, named schema1, under catalog1, and set dbr_ext_loc_schema1 as this schema's Storage Location
- I created a Databricks Volume, named volumn11, under schema1 (a rough SQL equivalent of these setup steps is sketched after the code below)
- In the Databricks UI, I can upload files to and download files from volumn11
- However, when I created an All-purpose compute and ran the Python code below, I always got the error "Operation failed: "This request is not authorized to perform this operation.", 403, GET".
dbutils.fs.ls("/Volumes/catalog1/schema1/volumn11")
dbutils.fs.ls("dbfs:/Volumes/catalog1/schema1/volumn11")
spark.read.format("csv").option("header","True").load("/Volumes/catalog1/schema1/volumn11/123.csv")
spark.read.format("csv").option("header","True").load("dbfs:/Volumes/catalog1/schema1/volumn11/123.csv")
Details about the All-purpose compute
- Type: Single node
- Access mode: Single user
- Single user access: myself
- Runtime version: 14.3 LTS
- Enable credential passthrough for user-level data access: disabled
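For completeness, this is roughly the same compute expressed in code (a sketch using the Databricks Python SDK, assuming databricks-sdk is installed; the cluster name, node type, and user name below are placeholders, since I actually created the compute through the UI):

```python
# Sketch only: a single-node, single-user cluster on Runtime 14.3 LTS created
# via the Databricks Python SDK. Placeholder values are marked in comments.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service import compute

w = WorkspaceClient()

cluster = w.clusters.create(
    cluster_name="uc-single-user-test",        # placeholder name
    spark_version="14.3.x-scala2.12",          # Runtime 14.3 LTS
    node_type_id="Standard_DS3_v2",            # placeholder node type
    num_workers=0,                             # single node
    data_security_mode=compute.DataSecurityMode.SINGLE_USER,
    single_user_name="me@example.com",         # placeholder for my own user
    spark_conf={
        "spark.databricks.cluster.profile": "singleNode",
        "spark.master": "local[*]",
    },
    custom_tags={"ResourceClass": "SingleNode"},
    autotermination_minutes=60,
).result()
```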