Executing multiple models via a single SQL file

I am an ETL developer, new to dbt.

The problem I’m having

I want to execute multiple model files through a single command.

The context of why I’m trying to do this

I am trying to build an ETL data pipeline using dbt, Snowflake, and Azure DevOps.
The DevOps pipeline will start execution on a schedule.
Through a dbt macro, I gather the values of a few variables, such as country name,
and using those I need to execute around 500 model files.

So instead of manually writing `dbt run -s model1.sql IND, model2.sql IND`,
can I create a file (let's say data_load.sql) with all the model names in it, and then just execute
`dbt run data_load.sql`
from the DevOps pipeline script?

What I’ve already tried

I have tried writing them out in a static format (the screenshot did not transfer).

Any help with this would be appreciated.

You can tag the models and select the tag, or you can write a [selector](https://docs.getdbt.com/reference/node-selection/yaml-selectors) that selects the right models and use that. In the most complex case that's basically the same as your request to have a file with the list of what to run.

Note: @Mike Stanley originally posted this reply in Slack. It might not have transferred perfectly.
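
For illustration, here is a minimal sketch of both suggestions. The tag name `country_load`, the folder `country_models`, the project name `my_project`, and the `country` variable are all hypothetical; adjust them to your own project.

```yaml
# dbt_project.yml (sketch): apply a tag to a whole folder of models
models:
  my_project:
    country_models:
      +tags: ["country_load"]
```

```yaml
# selectors.yml (sketch): a named selector that picks up the same tag
selectors:
  - name: country_load
    definition:
      method: tag
      value: country_load
```

The DevOps pipeline can then run either `dbt run --select tag:country_load --vars '{"country": "IND"}'` or `dbt run --selector country_load --vars '{"country": "IND"}'` instead of listing 500 model names.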

Hello @a_slack_user,

This resolves my problem.
I just have one thought: won't this execute the models in parallel?
Also, can we include a condition in it,
for example `if var a > var b then`?

I’m not aware of a way inside dbt to decide whether or not to run models based on conditions, but you could write a script that decides what to run and then calls dbt.
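
As a rough sketch of that idea (this is a plain wrapper script, not a dbt feature), the pipeline could compare the variables and then call dbt with the matching selection. The variable values and selector names below are made up for the example:

```python
import subprocess

# Hypothetical values: in practice these might come from pipeline
# variables or from a query against Snowflake.
var_a = 10
var_b = 5

# Decide what to run, then invoke dbt (both selector names are made up).
selector = "full_country_load" if var_a > var_b else "incremental_country_load"

subprocess.run(
    ["dbt", "run", "--selector", selector, "--vars", '{"country": "IND"}'],
    check=True,
)
```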

It will execute as many models in parallel as you have set [threads](https://docs.getdbt.com/docs/running-a-dbt-project/using-threads). The main reason to change this value is if your database doesn’t like the number of open connections or simultaneous queries being submitted.

Note: @Mike Stanley originally posted this reply in Slack. It might not have transferred perfectly.
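
For reference, the thread count is set per target in profiles.yml (and can be overridden at runtime with `dbt run --threads N`). A Snowflake profile with placeholder connection details might look like:

```yaml
# profiles.yml (sketch): placeholder Snowflake target running 8 threads
my_profile:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: my_account       # placeholder
      user: my_user             # placeholder
      password: my_password     # placeholder
      database: analytics       # placeholder
      warehouse: transforming   # placeholder
      schema: dbt_dev           # placeholder
      threads: 8                # up to 8 models run concurrently
```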

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.