Job parameterization through the dbt API

The problem I’m having

I’d like to be able to inject variable values into a dbt model when triggering a job through the API, but looking at the documentation for API version 1.7, that does not seem to be possible.

The context of why I’m trying to do this

I’ve got a large number of instances of an AWS Batch job that run concurrently, and I’d like to trigger a dbt model parameterized with the associated IDs once each batch job finishes.

What I’ve already tried

I understand that this is possible when running a job through the command line using --vars. I’m pretty much trying to replicate that functionality through a dbt API call.
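For reference, this is roughly what that looks like on the command line today (the model name and variable value are just placeholders):

dbt run --select my_model --vars '{"batch_id": 1234}'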


We are also looking for a similar feature. I can see that someone asked for this a year ago as well, here: Pass user input from Airflow to Workflow defined to run DBT job - #3 by Surya

It would be extremely useful to be able to customise the logic of a dbt job based on dynamic inputs.

You can use the steps_override option to dynamically pass variables to your dbt job. This option allows you to replace the default commands of the job with the specific commands provided in steps_override, ensuring the job executes those commands instead.

Currently, this feature is the only available solution to address your requirement.

import requests

account_id = 12345
job_id = 123456
end_point = f'https://cloud.getdbt.com/api/v2/accounts/{account_id}/jobs/{job_id}/run/'
api_key = 'xxxxxxxxx'

headers_auth = {
    'Authorization': f'Bearer {api_key}',
}

# steps_override replaces the job's saved commands for this run only,
# so the --vars value can be set dynamically on every trigger
body = {
    'cause': 'Triggered by Python Script',
    'steps_override': [
        "dbt run -s model1 model_2 --vars 'batch_id: 1234'"
    ]
}

res = requests.post(
    url=end_point,
    headers=headers_auth,
    json=body,
)
print(res)  # prints only the HTTP status; inspect res.json() for run details
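If the trigger succeeds, the run details should come back in the response body; as far as I can tell from the v2 API, the new run’s id is under data.id, which you can use later to poll the run status:

run_id = res.json()['data']['id']  # id of the triggered run (assumes the v2 response shape)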

Access the batch_id in your model using the var() function, e.g. {{ var('batch_id') }}.
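For example, a model could filter on it like this (the model and column names here are just illustrative):

select *
from {{ ref('my_source_model') }}
where batch_id = {{ var('batch_id') }}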

Let me know if it works