Change in data type for a field not considered in dbt snapshot

I’m running a snapshot on a table, but only checking specific fields, as follows:

    {% snapshot prestaging_pipeline_check %}

        {{
            config(
              target_schema='snapshots',
              strategy='check',
              unique_key='col_V',
              check_cols=['col_Y', 'col_Z'],
            )
        }}

        select * from {{ ref("table_X") }}

    {% endsnapshot %}

However, the snapshot fails on a completely different field in the table that has changed its data type:

    08:58:52
    08:58:52  Database Error in snapshot prestaging_pipeline_check (snapshots/prestaging_pipeline_snapshot.sql)
    08:58:52    Value has type STRING which cannot be inserted into column col_A, which has type ARRAY<STRING> at [16:93]
    08:58:52    compiled SQL at target/run/datapipelines/snapshots/prestaging_pipeline_snapshot.sql
    08:58:52

Looking at the logs, the snapshot does try to access the entire table, but shouldn’t it only be using the specific columns listed in check_cols?
Any suggestions on how to fix this without having to drop the original table?

Information:
Running on dbt Cloud

col_A is included in the select * statement, which is why the error comes up.

check_cols only controls which columns are compared to decide whether a row has changed and needs a new snapshot record; it doesn’t limit which columns get stored, so the snapshot still selects and inserts every column from table_X. At least one of the new/changed records has a STRING value in col_A instead of an ARRAY<STRING>, so it can’t be inserted into the snapshot table.

If you don’t want that column to break the snapshot, you can either exclude it from the select statement or cast it to an array instead of a single string.
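For the casting option, here’s a minimal sketch of what the adjusted snapshot could look like, assuming BigQuery (the ARRAY<STRING> type and [16:93] location in the error suggest it) and assuming col_A in table_X is now a plain STRING:

    {% snapshot prestaging_pipeline_check %}

        {{
            config(
              target_schema='snapshots',
              strategy='check',
              unique_key='col_V',
              check_cols=['col_Y', 'col_Z'],
            )
        }}

        select
            -- keep every other column as-is
            * except (col_A),
            -- wrap the STRING value in an array so it matches the snapshot
            -- table's existing ARRAY<STRING> column; rows where col_A is NULL
            -- would need extra handling, since BigQuery won't write arrays
            -- containing NULL elements to a table
            [col_A] as col_A
        from {{ ref("table_X") }}

    {% endsnapshot %}

check_cols still only watches col_Y and col_Z; the cast just keeps the stored copy of col_A compatible with the existing snapshot table.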

If you remove it from the select statement, you’ll also need to drop the column from the existing snapshot table (or keep it in your select statement as a placeholder, e.g. select null as col_A).
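A sketch of that placeholder route, again assuming BigQuery: replace only the select inside the snapshot, and cast the NULL explicitly (a bare null as col_A would typically be typed as INT64 and still mismatch the ARRAY<STRING> column):

    select
        * except (col_A),
        -- keep col_A in the snapshot's schema but stop capturing its values
        cast(null as array<string>) as col_A
    from {{ ref("table_X") }}

Rows already in the snapshot table keep the col_A values they were captured with; only new snapshot rows get the NULL placeholder.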