III
January 24, 2023, 11:34am
1
The problem I’m having
Can’t run dbt model in airflow due to missing environment variables.
The context of why I’m trying to do this
Setting up environment variables in an Airflow DAG that uses the dbt operators from GitHub - gocardless/airflow-dbt: Apache Airflow integration for dbt.
What I’ve already tried
Using env in the default_args dict.
Some example code or error messages
from airflow import DAG
from airflow_dbt.operators.dbt_operator import DbtRunOperator

default_args = {
    'dir': '/opt/airflow/dbt',
    'profiles_dir': '/opt/airflow/dbt',
    'owner': 'airflow',
}

with DAG(dag_id='example', default_args=default_args, schedule_interval=None) as dag:
    dbt_run = DbtRunOperator(
        task_id='dbt_run',
        select='/opt/airflow/dbt/models/core/example.sql',
    )
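As background on the "missing environment variables" symptom: the dbt subprocess only sees variables present in the environment of the Airflow worker process that launches it. A minimal sketch of one common workaround is to export the variables from the DAG module itself via os.environ before the operator runs; the names DBT_USER and DBT_PASSWORD below are hypothetical placeholders, not from the original post, and with local or Celery executors this assignment runs again at task time because the worker re-parses the DAG file:

```python
import os
import subprocess
import sys

# Hypothetical variables that profiles.yml might read via env_var();
# the names are placeholders, not taken from the thread.
os.environ['DBT_USER'] = 'analytics'

# Any subprocess launched afterwards (as the dbt CLI would be)
# inherits the parent process environment:
result = subprocess.run(
    [sys.executable, '-c', "import os; print(os.environ['DBT_USER'])"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # analytics
```

The same inheritance applies to the dbt binary invoked by the operator, so if the variable is visible here it will be visible to dbt.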
wefo
January 24, 2023, 1:27pm
2
Hey @III
Based on the airflow-dbt documentation, the operators accept multiple arguments, including target.
Have you tried this:
dbt_run = DbtRunOperator(
    task_id='dbt_run',
    select='/opt/airflow/dbt/models/core/example.sql',
    target=[YOUR_TARGET_NAME]
)
III
January 24, 2023, 3:42pm
3
Thanks, this can now be marked as resolved.
Hey, I would not recommend using the operators maintained by gocardless, as that repo is not kept up to date. Most of the latest dbt CLI flags are not supported.
Regards
Minhaj
III
January 26, 2023, 4:41pm
5
Thanks Minhaj, can you please recommend an alternative for dbt Core?
Hey,
Easy option: run the dbt commands from a BashOperator to start with.
- If you are on Airflow via Cloud Composer, I would recommend running dbt with the KubernetesPodOperator.
- There is also an option that is not yet ready for production workloads, but you may try it too.
system
Closed
February 3, 2023, 8:43am
7
This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.