I am trying to use the astronomer-cosmos package on AWS MWAA. The setup is fine and the cosmos package is installed as listed in the requirements.txt file.
I keep the Snowflake credentials in AWS Secrets Manager. In the dbt profiles.yml, I pass the connection parameters as env_var lookups. I don't want to maintain credentials in Airflow connections.
profiles.yml
dbt_project:
  outputs:
    dev:
      type: snowflake
      account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
      user: "{{ env_var('SNOWFLAKE_USER') }}"
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"
      role: "{{ env_var('SNOWFLAKE_ROLE') }}"
      database: "{{ env_var('SNOWFLAKE_DATABASE') }}"
      warehouse: "{{ env_var('SNOWFLAKE_WAREHOUSE') }}"
      schema: "{{ env_var('SNOWFLAKE_SCHEMA') }}"
      threads: 1
  target: dev
Below is my DAG code:
from datetime import datetime
import os
from pathlib import Path

from cosmos import DbtDag, ProjectConfig, ProfileConfig, ExecutionConfig, RenderConfig

dbt_project_path = Path("/usr/local/airflow/dags/dbt/dbt_project/")

profile_config = ProfileConfig(
    profile_name="dbt_project",
    target_name="dev",
    profiles_yml_filepath="/usr/local/airflow/dags/dbt/dbt_project/profiles.yml",
)

dim = DbtDag(
    project_config=ProjectConfig(dbt_project_path, models_relative_path="models/"),
    operator_args={"install_deps": True},
    profile_config=profile_config,
    execution_config=ExecutionConfig(
        dbt_executable_path=f"{os.environ['AIRFLOW_HOME']}/dbt_venv/bin/dbt",
    ),
    schedule_interval="@daily",
    start_date=datetime(2023, 9, 10),
    catchup=False,
    dag_id="dim",
    render_config=RenderConfig(select=["+path:models/dim/dim.sql"]),
)
Now I need to populate those env_vars by reading from Secrets Manager. I wrote a Python script that fetches the values and sets them as environment variables, but I am stuck on how to make those environment variables visible to the DAG: profiles.yml is not picking them up.
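For reference, this is roughly the shape of what I tried: a minimal sketch, assuming the secret is stored as a JSON key/value map (the secret name, region, and key names below are hypothetical placeholders, not my real setup). The idea is to turn the secret payload into a dict of environment variable names matching what profiles.yml expects, which could then be exported or handed to the dbt process.

```python
import json


def fetch_secret_string(secret_id: str, region: str = "us-east-1") -> str:
    """Fetch the raw SecretString from AWS Secrets Manager.

    boto3 is imported lazily so the rest of the module works without AWS
    access; secret_id and region here are hypothetical examples.
    """
    import boto3

    client = boto3.client("secretsmanager", region_name=region)
    return client.get_secret_value(SecretId=secret_id)["SecretString"]


def secret_to_env(secret_string: str, prefix: str = "SNOWFLAKE_") -> dict:
    """Map a JSON secret payload to the env var names used in profiles.yml."""
    creds = json.loads(secret_string)
    return {f"{prefix}{key.upper()}": str(value) for key, value in creds.items()}


# Hypothetical secret payload, as it might be stored in Secrets Manager:
secret_string = json.dumps({
    "account": "abc12345",
    "user": "dbt_user",
    "password": "s3cret",
    "role": "TRANSFORMER",
    "database": "ANALYTICS",
    "warehouse": "TRANSFORMING",
    "schema": "DBT_DEV",
})

env_vars = secret_to_env(secret_string)
# env_vars["SNOWFLAKE_ACCOUNT"] == "abc12345", and so on for each key.
```

My understanding (which may depend on the cosmos version) is that a dict like this could be passed to the dbt subprocess via operator_args, e.g. operator_args={"install_deps": True, "env": env_vars}, rather than relying on os.environ being set on the worker, but I have not been able to confirm this works.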