I’m trying to use the dbt-external-tables package with my Snowflake account, through dbt Cloud. This should create raw tables for me, using S3 as the source. Below is my source configuration:
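(Names below are placeholders for my actual env var, stage, and table; the shape follows the standard dbt-external-tables pattern.)

version: 2

sources:
  - name: raw
    database: "{{ env_var('DBT_DATABASE') }}"  # placeholder env var name; resolves to dev
    schema: _raw
    tables:
      - name: my_events                        # placeholder table name
        external:
          location: "@dev._raw.my_s3_stage"    # placeholder external stage over the S3 bucket
          file_format: "( type = parquet )"    # placeholder file format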
When I run dbt run-operation stage_external_sources, I get the following error (a sample of the full log):
17:34:40 Opening a new connection, currently in state init
17:34:41 Snowflake adapter: Snowflake query id: 01bbb5be-0304-f1f7-0004-2c6700109552
17:34:41 Snowflake adapter: Snowflake error: 002043 (02000): SQL compilation error:
Object does not exist, or operation cannot be performed.
17:34:41 Snowflake adapter: Error running SQL: macro list_relations_without_caching
17:34:41 Snowflake adapter: Rolling back transaction.
17:34:41 While listing relations in database=dev, schema=_raw, found:
17:34:41 Using snowflake connection "macro_stage_external_sources"
17:34:41 On macro_stage_external_sources: /* {"app": "dbt", "dbt_version": "2025.4.7+8c17c1e", "profile_name": "user", "target_name": "default", "connection_name": "macro_stage_external_sources"} */
show terse schemas like '_raw' in database dev limit 1;
17:34:42 Snowflake adapter: Snowflake query id: 01bbb5be-0304-f8e9-0004-2c670010c0b6
17:34:42 Snowflake adapter: Snowflake error: 002043 (02000): SQL compilation error:
Object does not exist, or operation cannot be performed.
17:34:42 Snowflake adapter: Error running SQL: macro stage_external_sources
17:34:42 Snowflake adapter: Rolling back transaction.
17:34:42 Encountered an error while running operation: Database Error
002043 (02000): SQL compilation error:
Object does not exist, or operation cannot be performed.
The environment variable resolves to the name “dev” for the database. I have confirmed in Snowflake that I have the proper privileges and grants to select from this database and external stages within it.
My personal credentials are using a role granted to my user that has full access to the database and all its schemas.
I’ve tried:
Adding the database and warehouse to my personal credentials
Removing the environment variable and directly using the database name in the source config
Any idea why dbt-external-tables can’t correctly connect to the database in Snowflake?
There’s a specific privilege in Snowflake to create external tables (CREATE EXTERNAL TABLE, which is separate from plain CREATE TABLE). Did you check that the user’s role has that privilege on the schema it’s working in?
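For example (role and stage names are placeholders), the check and grants would look something like:

-- see what the role already has on the schema
show grants on schema dev._raw;

-- external tables need their own privilege, separate from create table
grant create external table on schema dev._raw to role my_dbt_role;

-- plus usage on the stage the external table reads from
grant usage on stage dev._raw.my_s3_stage to role my_dbt_role;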
It’s strange: when I run queries directly in Snowflake, quoting the various parts of the fully qualified namespace, like "db"."schema", things work. So maybe there’s something with how quoting works in the yml source config?
Normally you don’t need quoted identifiers in Snowflake unless the object was created with them originally, so unless you created that database/schema with quotes (e.g. as "_raw" rather than _raw), quoting shouldn’t matter
If you can’t run normal statements on that same database and schema without quotes in Snowflake’s own interface, that would be a sign that the database or schema name was quoted on creation
If you want to check, see this doc on enabling quoted identifiers on sources: https://docs.getdbt.com/reference/resource-properties/quoting, but my recommendation, if you can, is to rename the database and/or schema so they don’t need quoted identifiers and save yourself a lot of headaches
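For reference, turning quoting on for a source looks something like this in the yml:

sources:
  - name: raw
    database: dev
    schema: _raw
    quoting:
      database: true
      schema: true
      identifier: true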
You can check by just trying to run use database.schema in the Snowflake UI. If it only works with quotes around the database and/or schema name, then that’s what’s happening.
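For example:

-- if only the quoted form works, the names were created as case-sensitive (quoted) identifiers
use schema dev._raw;
use schema "dev"."_raw";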
They’re a pain because not all tools support/understand quoted identifiers, so even though you can set dbt to use them on a source-by-source basis, it can become a headache with other tools downstream
show objects in dev._raw limit 10000;
show terse schemas like '_raw' in database dev limit 1;
Basically, if you don’t use quotes, Snowflake converts all object names to uppercase, but if you use quotes it’s case sensitive. So dev._raw matches "DEV"."_RAW" but not "dev"."_raw"; because the latter contains lowercase letters, it can only be referenced with quotes
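For example (table name is a placeholder):

-- unquoted identifiers fold to uppercase, so these resolve to the same object
select * from dev._raw.my_table;
select * from "DEV"."_RAW"."MY_TABLE";

-- a schema created as "_raw" (quoted, lowercase) can only be reached with quotes
select * from "dev"."_raw"."my_table";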
If Terraform created the database/schema as "dev"."_raw", then you either get Terraform to drop the quotes, always use uppercase names, or commit to quoted identifiers and lowercase names everywhere in Snowflake
Mark, apologies for the delay as I fought my way through this. Yes indeed, using all caps worked, and is the more extensible solution for the future. Really appreciate the help.