Dynamic Column Generation and Incremental Updates

The Context

I’m building a database using dbt. My source system contains the information about which columns each table should contain.
My approach so far has been to have a macro generate the tables for me and update the columns accordingly whenever the information in the source system changes.
Here is some simplified pseudocode of a model called “my_table_name”:

-- retrieve information from the source system by using "run_query()"
{% set column_array = get_columns_from_source_system("my_table_name") %}
-- generate the table dynamically with the provided columns
{{ generate_table_sql("my_table_name", column_array) }}
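For context, a simplified sketch of what “generate_table_sql” could look like (the source() reference is only a placeholder):

{% macro generate_table_sql(table_name, columns) %}
    {#- simplified sketch: select the requested columns; the source() reference
        is only a placeholder for wherever the data actually lives -#}
    select
        {%- for col in columns %}
        {{ col }}{{ "," if not loop.last }}
        {%- endfor %}
    from {{ source('raw_system', table_name) }}
{% endmacro %}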

The “get_columns_from_source_system” macro uses “run_query()” to retrieve the required information from the source system.
This works fine for full refreshes, but I’m facing trouble when implementing incremental updates.
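A simplified sketch of what “get_columns_from_source_system” could look like (the metadata table and column names are placeholders, not the real source system):

{% macro get_columns_from_source_system(table_name) %}
    {#- simplified sketch: the metadata table and column names are placeholders -#}
    {% if not execute %}
        {{ return([]) }}
    {% endif %}
    {% set results = run_query(
        "select column_name from metadata.table_columns where table_name = '" ~ table_name ~ "'"
    ) %}
    {{ return(results.columns[0].values() | list) }}
{% endmacro %}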

My Goal

When the columns of the table stay the same, I want the table to be updated incrementally:

{{ config(
    materialized='incremental',
    unique_key='id'
) }}

When the columns of the table change, I want to materialize the model as a table again:

{{ config(
    materialized='table'
) }}

What I have tried so far

I tried to implement this in the following way:

-- retrieve information from the source system by using "run_query()"
{% set column_array = get_columns_from_source_system("my_table_name") %}
-- returns true if the current table does not contain exactly the columns defined in the column_array
-- (a sketch of this macro follows below)
{% set columns_have_changed = have_columns_changed(column_array, this) %}

-- if the columns have changed, do a "full" refresh, otherwise update incrementally
{% if columns_have_changed %}
{{ config(
    materialized='table'
) }}
{% else %}
{{ config(
    materialized='incremental',
    unique_key='id'
) }}
{% endif %}

-- generate the table dynamically with the provided columns
{{ generate_table_sql("my_table_name", column_array) }}
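For completeness, a simplified sketch of what “have_columns_changed” could look like (it compares column names only, case-insensitively and ignoring order; a missing relation counts as “changed”):

{% macro have_columns_changed(column_array, relation) %}
    {#- simplified sketch: compares column names only; a missing relation
        (e.g. on the very first run) counts as "changed" -#}
    {% if not execute %}
        {{ return(true) }}
    {% endif %}
    {% set existing_relation = adapter.get_relation(
        database=relation.database, schema=relation.schema, identifier=relation.identifier) %}
    {% if existing_relation is none %}
        {{ return(true) }}
    {% endif %}
    {% set existing = adapter.get_columns_in_relation(existing_relation)
                      | map(attribute='name') | map('lower') | list %}
    {% set expected = column_array | map('lower') | list %}
    {{ return(existing | sort != expected | sort) }}
{% endmacro %}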

This unfortunately does not work, since the macro “get_columns_from_source_system” only returns values while my model is actually being compiled and run.
By that time, dbt has already decided to materialize my model as a table instead of incrementally.
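As far as I understand, dbt renders every model once at parse time with the Jinja variable “execute” set to False, and “run_query()” returns nothing during that phase, which is exactly the rendering dbt uses to read config(). A guard like the following (my own illustration, not a fix) makes the two phases visible:

{% if execute %}
    -- execution phase: run_query() returns real results
    {% set column_array = get_columns_from_source_system("my_table_name") %}
{% else %}
    -- parse phase: run_query() returns nothing, so column_array is empty here,
    -- and this is the rendering dbt uses to decide on the materialization
    {% set column_array = [] %}
{% endif %}

The log output below shows both renderings: during parsing column_array is empty and the model is materialized as a table; during execution the columns are there, but the materialization has already been decided.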

Running with dbt=1.6.5
Registered adapter: snowflake=1.6.4

LogInfo: column_array: []
LogInfo: Columns have changed. Materializing as TABLE
LogInfo: "is_incremental()" ==  False

Found 275 models, 1 analysis, 1 seed, 632 tests, 22 sources, 0 exposures, 0 metrics, 505 macros, 0 
groups, 0 semantic models

LogInfo: column_array: ['col_1', 'col_2', 'col_3']
LogInfo: Columns have not changed. Materializing INCREMENTAL
LogInfo: "is_incremental()" ==  False

1 of 1 OK created sql table model inc.incremental_test ............. [SUCCESS 1 in 6.03s]
Finished running 1 table model in 0 hours 0 minutes and 10.95 seconds (10.95s).

Completed successfully
Done. PASS=1 WARN=0 ERROR=0 SKIP=0 TOTAL=1

Question

Is there a way to get the column information before dbt has determined how to materialize the model?
Alternatively: is there a way to retrieve any kind of information using the “run_query()” command before dbt starts compiling?

Thanks in advance :smiley:

Why do you want to change the materialization? Why not have a pre-trigger to add the new column?
You already have a list.

Note: @Nick Pinfold originally posted this reply in Slack. It might not have transferred perfectly.
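
A rough sketch of what that suggestion could look like: keep the model incremental and have a helper add any columns that the existing table is missing before the new data is written. The macro below is only an illustration of that idea (it assumes columns are only ever added, and varchar is a placeholder type):

{% macro add_missing_columns(relation, desired_columns) %}
    {#- hypothetical helper: add every column from the source-system list that the
        existing table does not have yet; varchar is only a placeholder type -#}
    {% if not execute %}
        {{ return('') }}
    {% endif %}
    {% set existing_relation = adapter.get_relation(
        database=relation.database, schema=relation.schema, identifier=relation.identifier) %}
    {% if existing_relation is none %}
        {{ return('') }}
    {% endif %}
    {% set existing = adapter.get_columns_in_relation(existing_relation)
                      | map(attribute='name') | map('lower') | list %}
    {% for col in desired_columns %}
        {% if col | lower not in existing %}
            {% do run_query("alter table " ~ existing_relation ~ " add column " ~ col ~ " varchar") %}
        {% endif %}
    {% endfor %}
{% endmacro %}

Invoked from the model, e.g. with {% do add_missing_columns(this, column_array) %} before generate_table_sql, the materialization could stay incremental the whole time.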