dbt removing column comments on views as part of each dbt run


I have a couple of curated views on Snowflake that are re-created by dbt a couple of times each day as part of the dbt build process. For the columns in these views, a separate process on Snowflake updates the column comments (the ones surfaced in <database_name>.information_schema.columns) from a table that is maintained elsewhere in the database. However, every dbt build wipes out these column comments: they are managed outside of dbt, so dbt has no knowledge of them, and each run recreates the views without them.
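For context, the external process applies comments with Snowflake's `COMMENT` command; a minimal sketch (all object names here are placeholders, not my real ones):

```sql
-- Sketch only: how the external process sets a column comment on a view.
-- Database, schema, view, and column names are hypothetical placeholders.
comment on column my_db.curated.customer_v.customer_id is
    'Unique customer identifier, sourced from the CRM master table.';
```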

I am looking for a way to avoid overwriting these column comments every time dbt issues a create or replace view command. I’ve tried a couple of options so far:

  1. Setting `columns: false` under the `persist_docs` config in the `dbt_project.yml` file (persist_docs | dbt Developer Hub). However, it did not work: the column comments still get wiped on each dbt build, presumably because the view itself is recreated regardless of whether dbt manages the comments.
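     For reference, this is roughly the config I tried (the project and folder names are illustrative):

     ```yaml
     # dbt_project.yml -- config I tried; project/folder names are illustrative
     models:
       my_project:
         curated:
           +persist_docs:
             relation: true
             columns: false   # hoped this would leave column comments alone
     ```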

  2. Doing something similar to this post - How to generate tables and its columns schema yaml reading from snowflake information_schema.columns. However, I do not want to build the whole yml file dynamically, only to set the column comments from the information schema in the yml file. I’m considering this option if nothing else works.
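     The comments I’d read back into the yml would come from a query along these lines (database and schema names are illustrative):

     ```sql
     -- Sketch: pull the externally maintained comments for the curated views
     select table_name, column_name, comment
     from my_db.information_schema.columns
     where table_schema = 'CURATED'
       and comment is not null
     order by table_name, ordinal_position;
     ```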

  3. Running a post-hook that calls my Snowflake stored procedure to reapply the column comments after the dbt jobs run. However, I’m not sure whether I should call it once at the very end of the dbt run, or individually inside the models where my views are defined.
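     A per-model version might look like this (the stored procedure name is hypothetical; `{{ this }}` resolves to the model’s fully qualified name when the hook runs):

     ```sql
     -- models/curated/customer_v.sql -- sketch; procedure name is hypothetical
     {{ config(
         materialized = 'view',
         post_hook = "call my_db.admin.restore_column_comments('{{ this }}')"
     ) }}

     select * from {{ ref('stg_customers') }}
     ```

     The run-level alternative would be an `on-run-end` hook in `dbt_project.yml` that calls a procedure reapplying the comments for all curated views at once.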

If anyone has run into a similar issue, please share how you resolved it.

Thanks much!