I'm rather new to Airflow and couldn't find information on this anywhere. In my DAG I read an existing Postgres connection configured in Airflow, take its credentials, and pass them to a task that uses `DatabricksSubmitRunOperator`. The task runs on an existing cluster and the credentials are passed as `base_parameters`.

The process works fine, but when you open the Databricks job details the `base_parameters` are visible, so credentials like login and password are exposed. I tried passing the credentials as environment variables instead, but they are not accessible between tasks.

TL;DR: Is there a way to read Airflow connection credentials and pass them to a `DatabricksSubmitRunOperator` task without exposing them?

What I've tried so far (a rough sketch of my current setup is below the list):
- I tried setting the credentials as environment variables, but the Databricks task can't access them.
- I tried pushing them temporarily to Databricks secrets, but that didn't work either.
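For reference, here is a rough sketch of my current setup (the connection id, cluster id, and notebook path are placeholders; I'm on Airflow 2.x with the Databricks provider):

```python
import pendulum
from airflow import DAG
from airflow.hooks.base import BaseHook
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG(
    dag_id="databricks_credentials_example",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule=None,
    catchup=False,
) as dag:
    # Read an existing Postgres connection configured in Airflow.
    conn = BaseHook.get_connection("my_postgres_conn")  # placeholder conn id

    submit_run = DatabricksSubmitRunOperator(
        task_id="run_notebook",
        databricks_conn_id="databricks_default",
        existing_cluster_id="1234-567890-abcdefgh",  # placeholder cluster id
        notebook_task={
            "notebook_path": "/Shared/my_notebook",  # placeholder path
            # These values show up in plain text in the Databricks run details,
            # which is exactly the problem I'm trying to avoid.
            "base_parameters": {
                "db_host": conn.host,
                "db_login": conn.login,
                "db_password": conn.password,
            },
        },
    )
```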