How to configure environment variables for the dbt operator

The problem I’m having

I can’t run a dbt model in Airflow because required environment variables are missing.

The context of why I’m trying to do this

Setting up environment variables in an Airflow DAG using the dbt operator (GitHub - gocardless/airflow-dbt: Apache Airflow integration for dbt).

What I’ve already tried

  1. Using env in the default_args (roughly the sketch below)
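
For reference, the attempt looked roughly like the sketch below (the variable name is just a placeholder; it is not in my example code further down), but the variables never seemed to reach dbt:

    import os

    # Roughly what was tried: an 'env' entry in default_args
    # (the variable name here is a placeholder).
    default_args = {
        'dir': '/opt/airflow/dbt',
        'profiles_dir': '/opt/airflow/dbt',
        'owner': 'airflow',
        'env': {'DBT_USER': os.environ.get('DBT_USER', '')},
    }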

Some example code or error messages

from datetime import datetime

from airflow import DAG
from airflow_dbt.operators.dbt_operator import DbtRunOperator

default_args = {
    'dir': '/opt/airflow/dbt',
    'profiles_dir': '/opt/airflow/dbt',
    'owner': 'airflow',
}

with DAG(
    dag_id='example',
    default_args=default_args,
    start_date=datetime(2021, 1, 1),  # a start_date is required before the DAG can run
    schedule_interval=None,
) as dag:

    dbt_run = DbtRunOperator(
        task_id='dbt_run',
        select='/opt/airflow/dbt/models/core/example.sql',
    )

Hey @III

Based on the airflow-dbt documentation, the operators accept several arguments, including target.

Have you tried this:

    dbt_run = DbtRunOperator(
        task_id='dbt_run',
        select='/opt/airflow/dbt/models/core/example.sql',
        target='YOUR_TARGET_NAME',  # replace with a target from your profiles.yml
    )
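
On the environment variables specifically: target selects one of the outputs defined in your profiles.yml, and dbt can read environment variables inside that file through its env_var() function. A minimal sketch, assuming a Postgres profile (the profile name, variable names, and connection details are all illustrative):

    example:
      target: dev
      outputs:
        dev:
          type: postgres
          host: "{{ env_var('DBT_HOST') }}"
          user: "{{ env_var('DBT_USER') }}"
          password: "{{ env_var('DBT_PASSWORD') }}"
          port: 5432
          dbname: analytics
          schema: public
          threads: 1

The variables referenced this way still have to be present in the environment of the Airflow worker that executes the task.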

Thanks, this can now be marked as resolved.

Hey, I would not recommend using the operators maintained by gocardless, as that repo is not kept up to date. Most of the latest dbt CLI flags are not supported.

Regards
Minhaj

Thanks Minhaj, can you please recommend an alternative for dbt Core?

Hey,
Easy option: run the dbt commands from a BashOperator to start with (see the sketch after this list).
- If you are on Cloud Composer, I would recommend running dbt via the KubernetesPodOperator.
- There is also a newer option that is not yet ready for production workloads, but you may try it too.
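
Here is a minimal sketch of the easy option, passing environment variables to dbt through BashOperator's env argument (the paths and variable names are assumptions based on the earlier posts):

    import os
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id='example_dbt_bash',
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
    ) as dag:
        dbt_run = BashOperator(
            task_id='dbt_run',
            bash_command=(
                'dbt run '
                '--profiles-dir /opt/airflow/dbt '
                '--project-dir /opt/airflow/dbt'
            ),
            # Passing env replaces the subprocess environment entirely,
            # so forward everything dbt needs, including PATH.
            env={
                'DBT_USER': os.environ.get('DBT_USER', ''),
                'DBT_PASSWORD': os.environ.get('DBT_PASSWORD', ''),
                'PATH': os.environ['PATH'],
            },
        )

If you go the Composer route, the KubernetesPodOperator works similarly via its env_vars argument.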
