Custom materialization to load from external table into Snowflake

Hi All,

Has anyone tested or built a custom materialization that uses the COPY INTO command to load data from an external table into Snowflake?

If you have created such a custom materialization, could you please share the details?

Hi @sudhirnune! You might find this post useful:


Unfortunately that article is not relevant to this question, which is how to use Snowflake’s COPY INTO command in dbt. The article doesn’t mention COPY INTO at all.
It would be really useful to have a way to do this with dbt though.

There’s an example of how to use a custom materialization to load data using COPY INTO here (see section “third attempt”): http://mamykin.com/posts/fast-data-load-snowflake-dbt/

In this example, the custom materialization creates a temporary external stage (pointing at an S3 bucket) and uses the COPY INTO … FROM (SELECT …) syntax.
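For anyone who hasn't seen the pattern, here is a rough sketch of what that kind of statement looks like in plain Snowflake SQL. All of the names (stage, bucket, table, columns) are hypothetical, and the credentials are placeholders:

```sql
-- Hypothetical sketch: a temporary external stage over S3,
-- then COPY INTO with a transforming SELECT from the stage.
create temporary stage tmp_events_stage
  url = 's3://my-bucket/events/'
  credentials = (aws_key_id = '<key_id>' aws_secret_key = '<secret_key>')
  file_format = (type = parquet);

copy into analytics.events
from (
  select
    $1:id::number,
    $1:name::string,
    $1:updated_at::timestamp_ntz
  from @tmp_events_stage
);
```

Because the stage is temporary, it disappears when the session ends, which suits a per-run materialization.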

Hey @NathanG! My intent wasn’t to state that that article was specifically the answer to a copy into materialization, but it is an example of a user’s custom materialization (which the author seemed to be looking for) :smiley:

Thanks for sharing the mamykin article — good stuff.

Another neat way of using dbt to load data into Snowflake is with external tables, which dbt has supported for Snowflake since v0.14.1. If you define the datasets in your buckets as external tables, and your folder structure is sensible enough to derive a partition key from it, you have all the ingredients to load from external tables into Snowflake tables with simple insert statements.
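As a sketch of what that looks like on the Snowflake side (the stage, table, column names, and the path position used for the partition key are all made up for illustration):

```sql
-- Hypothetical sketch: a partitioned external table over an S3 stage.
-- The partition column is derived from the file path via metadata$filename.
create external table raw.events (
  event_date date as (to_date(split_part(metadata$filename, '/', 3))),
  payload    variant as (value)
)
partition by (event_date)
location = @raw.events_stage/
file_format = (type = parquet);
```

With a partition column like `event_date` in place, an insert from the external table can be restricted to just the new partitions instead of rescanning the whole bucket.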

We do this at Canva: we made a custom materialisation called insert because we did not want the overhead of the two writes that the incremental materialisation performs (for a good reason).

So in the end our compiled dbt SQL looks like:

```sql
-- pre-hook: refresh external tables
alter external table ... refresh;

insert into target
select * from external_table
where { filter criteria }
```
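This isn't Canva's actual code, but a minimal custom insert materialization along those lines might look roughly like this (a sketch only — a production version would also handle create-if-not-exists, hooks, and grants):

```sql
-- Hypothetical minimal custom materialization: a single INSERT, no staging table.
-- Assumes the target table already exists with a compatible schema.
{% materialization insert, adapter='snowflake' %}

  {%- set target_relation = this -%}

  {% call statement('main') -%}
    insert into {{ target_relation }}
    {{ sql }}
  {%- endcall %}

  {{ return({'relations': [target_relation]}) }}

{% endmaterialization %}
```

A model would then use it with something like `{{ config(materialized='insert', pre_hook="alter external table raw.events refresh") }}`, so the external table metadata is refreshed before the insert runs.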

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.