Has anyone leveraged Snowflake streams as a source table for incremental loads in dbt? The stream gives you the inserts/updates/deletes "auto-magically", so there is no need to write "update_date" or "check" logic to work out which rows changed.
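For context, the stream itself is just ordinary Snowflake DDL on the source table. A minimal sketch (table and stream names here are made up):

```sql
-- Hypothetical example: a standard stream on a raw source table.
-- Snowflake exposes METADATA$ACTION / METADATA$ISUPDATE / METADATA$ROW_ID
-- on the stream, which is what gives you change capture for free.
create or replace stream raw.orders_stream
    on table raw.orders
    append_only = false;  -- capture inserts, updates and deletes
```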
When dbt issues DML against the stream (the incremental insert/merge), Snowflake advances the stream's offset, so the stream is cleared and ready for the next dbt run against that table.
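A rough sketch of what the dbt side could look like, assuming the stream above is declared as a dbt source (all names are hypothetical, not a definitive implementation):

```sql
-- models/orders_changes.sql  (names are made up)
-- Incremental model selecting straight from the stream. The merge/insert that
-- dbt runs (and the CTAS on the very first run) reads the stream inside DML,
-- so Snowflake advances the stream offset as part of that statement.
{{ config(
    materialized = 'incremental',
    unique_key   = 'order_id'
) }}

select
    order_id,
    order_status,
    amount,
    metadata$action   as change_action,   -- 'INSERT' or 'DELETE'
    metadata$isupdate as is_update        -- true when the row is part of an update
from {{ source('raw', 'orders_stream') }}
```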
Alternatively, dbt can first insert from the stream into a "stage" transient table (which gets truncated and reloaded on each run via a pre-hook), and another dbt model can then read from that table; a sketch of that variant is below.
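Something along these lines for the staging variant (again, all names are hypothetical):

```sql
-- models/stg_orders_from_stream.sql  (names are made up)
-- Transient "stage" table: the pre-hook empties it, the incremental insert
-- refills it from the stream, and downstream models select from this table
-- instead of touching the stream directly.
{{ config(
    materialized = 'incremental',
    transient    = true,
    pre_hook     = "truncate table if exists {{ this }}"
) }}

select *
from {{ source('raw', 'orders_stream') }}
```

Downstream models would then just `select ... from {{ ref('stg_orders_from_stream') }}`, so only this one model ever consumes the stream.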
Either way, the data in the stream is "bullet proof" and requires no additional logic to capture source changes.