dbt - fail all independent models in a folder even if 1 model fails

The stg_models folder has 30 models that are independent of each other (no references between them).

stg_model_1.sql - with 5 columns as final output
stg_model_2.sql - with 15 columns as final output
stg_model_3.sql - with 55 columns as final output
...
stg_model_30.sql - with 100 columns as output

When dbt run -s models/stg_models/* is executed, all 30 models are run.
The requirement is: even if only 1 of these 30 models fails, the other models should not run, or should also fail.
The fail-fast option in dbt doesn't work here, because models that have already succeeded stay built even if one fails.
Any idea if this is possible in dbt Cloud? I don't think we can specify transactions in dbt. We are using Snowflake.
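Since dbt has no transaction spanning multiple models, the all-or-nothing behavior would have to live in an external orchestrator that runs the models and cleans up on failure. A minimal sketch, assuming hypothetical helper names (`run_model` could shell out to `dbt run --select <model>`, and `cleanup` could drop the staging schema via a `dbt run-operation` macro you would write yourself):

```python
import subprocess
from typing import Callable, Sequence


def run_all_or_nothing(
    run_model: Callable[[str], bool],
    cleanup: Callable[[], None],
    models: Sequence[str],
) -> bool:
    """Run every model; if any fails, invoke cleanup so no partial results survive.

    run_model returns True on success. This only approximates a transaction:
    tables built before the failure exist until cleanup drops them.
    """
    for model in models:
        if not run_model(model):
            cleanup()  # e.g. drop the staging schema so earlier successes are undone
            return False
    return True


def dbt_run(model: str) -> bool:
    # Hypothetical runner: build one model via the dbt CLI.
    return subprocess.run(["dbt", "run", "--select", model]).returncode == 0
```

The key design point is that "failing" the already-built models can only mean undoing them after the fact; Snowflake commits each dbt model's DDL separately, so there is no shared transaction to roll back.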

Try changing the thread count to 1 and then use --fail-fast. But here too the models will run one after another: let's say 4 models run successfully and the 5th one fails, then models 6-30 will not run. Does this work for you?
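As a concrete invocation, the suggestion above would look like the following (using dbt's path: selection method; adjust the path to your project layout):

```shell
# Build the staging models one at a time (--threads 1) and stop
# at the first failure (--fail-fast). Models already built before
# the failure remain in the warehouse.
dbt run --select path:models/stg_models --threads 1 --fail-fast
```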

FYI: In dbt, "threads" in job settings refers to the number of models that dbt can build concurrently. By increasing the number of threads, you can parallelize node execution, potentially reducing the overall runtime of your dbt project. The default value is 4.

Note: @Manikanta Divvela originally posted this reply in Slack. It might not have transferred perfectly.

Thanks for the reply, but that doesn't work. An all-or-nothing approach is required.