How to use dbt with Snowflake as a source but write models locally?

Hi everyone,

I’ve successfully connected dbt to Snowflake and can read data and run tests. However, I’d like advice on the best way to set up the workflow I have in mind:

  • I want Snowflake to be used only as a source, meaning dbt should only read from Snowflake tables.
  • I don’t want dbt to create or write any models or tables back to Snowflake.
  • Instead, I’d like dbt to create all model outputs locally (for example, in DuckDB or in files like CSV or Parquet), so I can later process or store the data elsewhere.
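To make the last point concrete, this is roughly the output I’m hoping for. It's a sketch only: the model and source names are placeholders, and I believe the dbt-duckdb adapter has an `external` materialization that can write Parquet, but I'm not certain of the exact config keys:

```sql
-- models/orders_local.sql (hypothetical model name)
-- Using dbt-duckdb's external materialization to land the result as a Parquet file,
-- assuming 'location' and 'format' are the right config keys:
{{ config(materialized='external', location='output/orders_local.parquet', format='parquet') }}

select *
from {{ source('snowflake_data', 'orders') }}  -- source name is a placeholder
```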

My questions are:

  1. Can I configure dbt to read from one source (Snowflake) but write to a different target (e.g., DuckDB or local files)?
  2. If yes, what’s the recommended way to set this up in profiles.yml? Should I define two different targets and use some kind of cross-database/source pattern?
  3. Alternatively, should I extract the data from Snowflake first (e.g., with a separate pipeline) and then point dbt only to local data sources?
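For reference on question 2, here is the kind of profiles.yml I was imagining: two outputs under one profile, with the account and credential values as placeholders. What I don't know is whether a single dbt invocation can read from one output (Snowflake) while writing to another (DuckDB), or whether each run is always bound to exactly one target:

```yaml
my_project:
  target: local
  outputs:
    snowflake:
      type: snowflake
      account: my_account          # placeholder
      user: my_user                # placeholder
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"
      database: ANALYTICS
      warehouse: COMPUTE_WH
      schema: PUBLIC
    local:
      type: duckdb
      path: local.duckdb           # local file for model outputs
```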

I appreciate any best practices or examples you can share. Thanks!