Extremely slow dbt run (15+ min) with a Databricks Unity Catalog schema of over 3,500 tables
Setup

python3.11
dbt-core=1.7.2
dbt-spark=1.7.1
dbt-databricks=1.7.1
databricks-sql-connector=2.9.3
Multi-node cluster running Databricks Runtime 12.2 LTS ML (Apache Spark 3.3.2, Scala 2.12)

Issue

When we dbt run our models, dbt appears to parse the entire schema's tables before the actual queries execute: although the model itself takes only 2 seconds to run, the parsing phase takes over 15 minutes. Our schema has over 3,500 tables, and for governance reasons we cannot create our own dedicated schema as the dbt documentation recommends.

We have tried some cache-related CLI options such as --cache-selected-only, --select, and --no-populate-cache, but they seem to have no impact.
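For the record, the same flags can also be set persistently rather than per-invocation. A minimal dbt_project.yml sketch, assuming the dbt 1.6+ project-level flags syntax (flag names should be verified against your installed version's documentation):

```yaml
# dbt_project.yml -- project-level flags (dbt >= 1.6 syntax; verify
# these names against the global-configs docs for your dbt version)
flags:
  cache_selected_only: true   # only cache schemas touched by selected models
  populate_cache: false       # skip the upfront relation-cache crawl entirely
```

In our testing these had the same (lack of) effect as the equivalent CLI options, but setting them in the project file rules out the flags simply not being picked up on the command line.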

Requested Outcome

What configuration changes can we make, short of creating our own separate schema, to shorten this 15-minute schema-parsing phase?

Reference

For context, I have been following this GitHub issue for some time while waiting for a solution:

Hello @takahiro.watanabe1
When you go through the dbt logs - are you able to see if it’s running a lot of describe extended queries? Also if you don’t mind opening an issue on the dbt-databricks adapter repo - this would be the best way for the Databricks team to triage issues.
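Following the suggestion in that reply, one way to confirm whether the slowdown comes from per-table metadata queries is to count describe extended statements in dbt's debug log (by default logs/dbt.log). The log lines below are a fabricated two-table sample purely to demonstrate the grep; point it at your real log file instead:

```shell
# Fabricated sample of a dbt debug log, for illustration only.
# In practice, skip this block and grep your actual logs/dbt.log.
mkdir -p logs
cat > logs/dbt.log <<'EOF'
... describe extended `main`.`big_schema`.`table_a` ...
... describe extended `main`.`big_schema`.`table_b` ...
... select 1 ...
EOF

# Count how many `describe extended` statements the run issued.
# A number close to the table count of the schema (3,500+) would
# confirm the relation cache is crawling the whole schema.
grep -ci 'describe extended' logs/dbt.log
```

If the count is in the thousands, that is strong evidence for the cache-population theory and worth including when opening the issue on the dbt-databricks adapter repo.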