Concurrent DBT Jobs

I’m not sure how concurrent jobs work in dbt Core and dbt Cloud. I’m running dbt Core in Airflow against a Snowflake data warehouse. Some models update incrementally every hour, but they also do full refreshes daily or weekly. My pool size in Airflow is 1, but I’d like to increase it if possible. Will increasing the pool size and running jobs at the same time mess up my data?
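For context, my hourly incremental models look roughly like this (model, source, and column names below are placeholders, not my real schema):

```sql
-- models/events_hourly.sql (hypothetical example)
{{ config(materialized='incremental', unique_key='event_id') }}

select *
from {{ source('raw', 'events') }}
{% if is_incremental() %}
  -- on incremental runs, only pull rows newer than what's already loaded
  where loaded_at > (select max(loaded_at) from {{ this }})
{% endif %}
```

The daily/weekly jobs run the same models with `dbt run --full-refresh`, so my worry is what happens if an hourly incremental run and a full refresh hit the same table at the same time.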

How does this work on dbt Cloud? Are there safeguards in place that somehow allow concurrent jobs without mucking up the data?