We use Airflow to orchestrate our dbt runs.
Currently the entire pipeline runs once a day, but we have requests to increase the frequency to once every 15 minutes for parts of the pipeline.
My question: if multiple dbt runs covering different parts of the pipeline execute in parallel, orchestrated by Airflow, is there any chance of data loss, locking, etc.? Are there any disadvantages to having multiple dbt runs going at the same time? What happens if the full dbt run executes at the same time as the smaller run that fires every 15 minutes? We have some incremental models as well. Airflow is running on Cloud Composer.
Our warehouse is BigQuery. Thanks.
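For context, here is roughly how we were planning to scope the two schedules using dbt's `--select` / `--exclude` node selection, so the frequent job and the nightly job would (ideally) not build the same models. The `tag:frequent` tag is just a placeholder name, and the helper below only builds the command strings each Airflow BashOperator would run:

```python
def dbt_command(select=None, exclude=None):
    """Build the dbt CLI invocation a BashOperator would execute."""
    parts = ["dbt", "run"]
    if select:
        parts += ["--select", select]   # only models matching the selector
    if exclude:
        parts += ["--exclude", exclude] # everything except the selector
    return " ".join(parts)

# Every 15 minutes: only the fast-moving, tagged subset.
frequent_run = dbt_command(select="tag:frequent")

# Nightly: the rest of the project, excluding the frequent subset,
# so the two schedules don't rebuild the same tables concurrently.
nightly_run = dbt_command(exclude="tag:frequent")

print(frequent_run)  # dbt run --select tag:frequent
print(nightly_run)   # dbt run --exclude tag:frequent
```

The part I'm unsure about is whether this separation is enough, or whether overlapping runs (especially against the incremental models) can still step on each other in BigQuery.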