dbt-external-tables FAILS interacting with Snowflake

I’m trying to use the dbt-external-tables package with my Snowflake account, through dbt Cloud. This should create raw tables for me, using S3 as the source. Below is my source configuration:

version: 2

sources:
  - name: dc305_animals
    database: "{{env_var('DBT_DB_NAME')}}"
    schema: _raw
    loader: S3
    tables:
      - name: file91
        description: Data from the FILEOUT91 file
        identifier: raw_dc305_animals__file91
        external:
          location: "@{{env_var('DBT_DB_NAME')}}.external_stages.s3_staging_bucket"
          pattern: "owner=\\d+/datatype=management/provider=dc305_animals/year=\\d{4}/month=\\d{2}/day=\\d{2}/FILEOUT91.*[.]parquet"
          file_format: "(type = parquet)"
          infer_schema: true

When I run dbt run-operation stage_external_sources, I get the following error (a sample of the full log):

17:34:40 Opening a new connection, currently in state init
17:34:41 Snowflake adapter: Snowflake query id: 01bbb5be-0304-f1f7-0004-2c6700109552
17:34:41 Snowflake adapter: Snowflake error: 002043 (02000): SQL compilation error:
Object does not exist, or operation cannot be performed.
17:34:41 Snowflake adapter: Error running SQL: macro list_relations_without_caching
17:34:41 Snowflake adapter: Rolling back transaction.
17:34:41 While listing relations in database=dev, schema=_raw, found: 
17:34:41 Using snowflake connection "macro_stage_external_sources"
17:34:41 On macro_stage_external_sources: /* {"app": "dbt", "dbt_version": "2025.4.7+8c17c1e", "profile_name": "user", "target_name": "default", "connection_name": "macro_stage_external_sources"} */
show terse schemas like '_raw' in database dev limit 1;
17:34:42 Snowflake adapter: Snowflake query id: 01bbb5be-0304-f8e9-0004-2c670010c0b6
17:34:42 Snowflake adapter: Snowflake error: 002043 (02000): SQL compilation error:
Object does not exist, or operation cannot be performed.
17:34:42 Snowflake adapter: Error running SQL: macro stage_external_sources
17:34:42 Snowflake adapter: Rolling back transaction.
17:34:42 Encountered an error while running operation: Database Error
  002043 (02000): SQL compilation error:
  Object does not exist, or operation cannot be performed.

The environment variable resolves to the name “dev” for the database. I have confirmed in Snowflake that I have the proper privileges and grants to select from this database and external stages within it.

My personal credentials are using a role granted to my user that has full access to the database and all its schemas.
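
For anyone checking the same thing, the grants can be verified with standard Snowflake commands along these lines (the role name transformer is a placeholder, not from this thread):

show grants to role transformer;
show grants on database dev;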

I’ve tried:

  • Adding the database and warehouse to my personal credentials
  • Removing the environment variable and directly using the database name in the source config

Any idea why dbt-external-tables can’t correctly connect to the database in Snowflake?

I joined the Slack too, happy to be here

Note: @adam.c.leppek originally posted this reply in Slack. It might not have transferred perfectly.

There’s a specific privilege in Snowflake to create external tables (CREATE EXTERNAL TABLE, which is different from plain CREATE TABLE). Did you check that the user’s role has that privilege on the schema it’s working in?
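
For reference, a minimal check and grant might look like this (the role name transformer is a placeholder; the schema and stage names are the ones from this thread):

show grants on schema dev._raw;
grant create external table on schema dev._raw to role transformer;
grant usage on stage dev.external_stages.s3_staging_bucket to role transformer;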

Note: @Mark Estey originally posted this reply in Slack. It might not have transferred perfectly.

Hey Mark, I checked and the role that my user is assigned has that privilege inside the schema I want to create external tables in.

Note: @adam.c.leppek originally posted this reply in Slack. It might not have transferred perfectly.

It’s strange, when I do queries directly in Snowflake, quoting the various parts of the fully qualified namespace, like "db"."schema", things seem to work. So maybe there’s something with how quoting works in the yml source config?

Note: @adam.c.leppek originally posted this reply in Slack. It might not have transferred perfectly.

Are you seeing the failed query in Snowflake’s query history? That could also help if you see it’s not being rendered with correct names

Note: @Mark Estey originally posted this reply in Slack. It might not have transferred perfectly.

Normally you don’t need quoted identifiers in Snowflake unless the object was created with them originally, so if you created that database/schema with a plain, unquoted name like _raw then it shouldn’t matter

Note: @Mark Estey originally posted this reply in Slack. It might not have transferred perfectly.

If you can’t run normal statements on that same database and schema without quotes in Snowflake’s own interface, then that would be a sign that the database name or schema name was quoted on creation

Note: @Mark Estey originally posted this reply in Slack. It might not have transferred perfectly.

If you want to check, see this doc on enabling quoted identifiers on sources: https://docs.getdbt.com/reference/resource-properties/quoting. But my recommendation, if you can, is to rename the database and/or schema so they don’t need quoted identifiers; it will save you a lot of headaches
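
Applied to the source config above, that quoting property would look something like this (a sketch; it tells dbt to quote the database and schema names when rendering SQL):

version: 2

sources:
  - name: dc305_animals
    database: "{{env_var('DBT_DB_NAME')}}"
    schema: _raw
    quoting:
      database: true
      schema: true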

Note: @Mark Estey originally posted this reply in Slack. It might not have transferred perfectly.

I actually made this database and its schemas using Terraform, so that could have something to do with it

Note: @adam.c.leppek originally posted this reply in Slack. It might not have transferred perfectly.

You can check by just trying to run use database.schema in the Snowflake UI. If it only works with quotes around the database and/or schema name, then that’s what’s happening.
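
Concretely, that check might look like this with the names from this thread (a sketch):

use schema dev._raw;        -- unquoted: resolves to DEV._RAW
use schema "dev"."_raw";    -- quoted: matches only objects created with lowercase names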

Note: @Mark Estey originally posted this reply in Slack. It might not have transferred perfectly.

They’re a pain because not all tools support/understand quoted identifiers, so even though you can set dbt to use them on a source-by-source basis, it can become a headache with other tools downstream

Note: @Mark Estey originally posted this reply in Slack. It might not have transferred perfectly.

show objects in dev._raw limit 10000;
show terse schemas like '_raw' in database dev limit 1;
show objects in dev._raw limit 10000;
show terse schemas like '_raw' in database dev limit 1;

Note: @adam.c.leppek originally posted this reply in Slack. It might not have transferred perfectly.

That was the query history for the commands run by dbt Cloud that failed with the SQL compilation error

Note: @adam.c.leppek originally posted this reply in Slack. It might not have transferred perfectly.

I think you’re onto something with these quotes

Note: @adam.c.leppek originally posted this reply in Slack. It might not have transferred perfectly.

And these definitely won’t work in the Snowflake UI

Note: @adam.c.leppek originally posted this reply in Slack. It might not have transferred perfectly.

Basically, if you don’t use quotes, Snowflake converts all object names to uppercase, but if you use quotes matching is case-sensitive: dev._raw matches "DEV"."_RAW" but not "dev"."_raw". Because the latter contains lowercase letters, it can only be referenced using quotes
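
A quick illustration of that resolution behavior (the table name is a hypothetical placeholder):

select count(*) from dev._raw.some_table;        -- resolves to DEV._RAW.SOME_TABLE
select count(*) from "dev"."_raw"."some_table";  -- case-sensitive match on the lowercase names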

Note: @Mark Estey originally posted this reply in Slack. It might not have transferred perfectly.

If Terraform created the database/schema as "dev"."_raw" then you either get Terraform to drop the quotes, switch to all-uppercase names, or commit to quoted, lowercase identifiers everywhere in Snowflake
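
If renaming on the Snowflake side is acceptable, it could be as simple as this sketch (run as a role that owns the objects; note that Terraform’s state would then drift, so updating the names in the Terraform config is the cleaner route):

alter database "dev" rename to DEV;
use database DEV;
alter schema "_raw" rename to _RAW;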

Note: @Mark Estey originally posted this reply in Slack. It might not have transferred perfectly.

Mark, apologies for the delay as I fought my way through this. Yes indeed, using all caps worked, and is the more extensible solution for the future. Really appreciate the help. :pray:

Note: @adam.c.leppek originally posted this reply in Slack. It might not have transferred perfectly.
