Preserving Snowflake Time Travel in dbt Models

Hello,

I’m running into an issue with dbt where my models/tables are recreated every time I run the project, even if the structure hasn’t changed. This breaks Snowflake’s Time Travel for these tables. I’d like to understand the best approach to avoid losing historical versions.


The problem I’m having

Every dbt run rebuilds models materialized as table using CREATE OR REPLACE TABLE. This recreates the table from scratch, which resets Snowflake's Time Travel history for that table. As a result, I cannot query historical versions of the data if we need to recover from a mistake.


The context of why I’m trying to do this

Time Travel is not used for business analytics, but as a safety mechanism for our team to retrieve historical data in case of mistakes or accidental data changes. Recreating the tables each run breaks this functionality, and I need a strategy to preserve historical data while still refreshing the models.


What I’ve already tried

  • Using standard materialized='table' models (our project's usual configuration; I know dbt's own default materialization is view).

  • Considering incremental models, but I'm not sure they're the right tool here: they're designed for optimizing loads, and I don't know whether they actually preserve Time Travel history.
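
For reference, this is roughly the incremental version I was considering. It's only a sketch: the `id` unique key and the `updated_at` column are placeholders I'd have to adapt to the real source table.

```sql
{{ config(materialized='incremental', unique_key='id') }}

select *
from staging.source_table

{% if is_incremental() %}
  -- on incremental runs, only pick up rows changed since the last run;
  -- `updated_at` is a placeholder for whatever change-tracking column exists
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

My understanding is that because incremental runs issue MERGE/INSERT statements instead of CREATE OR REPLACE, the table object survives between runs, but I'd like confirmation that this is the right reason to use them here.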


Some example code or error messages

Example model currently causing issues:

{{ config(materialized='table') }}

select *
from staging.source_table

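For context, the kind of historical query that triggers the error below looks like this (the one-hour offset is just an example, and `analytics.my_model` is a placeholder for the actual model's table):

```sql
-- query the model as it existed one hour ago
select *
from analytics.my_model at(offset => -3600);
```
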
Snowflake error when trying to query historical data:

Time travel data is not available for the table.
The requested time is either beyond the allowed time travel period or before the object creation time.


I’m looking for guidance on dbt best practices for preserving Snowflake Time Travel on these tables, specifically for internal recovery purposes, without having to convert every model to incremental logic.

Thank you!
Sarah