Debugging a Failing dbt run with Iceberg + Spark + GCS on Windows

MAIN ISSUE:
I can connect to the GCP Iceberg catalog and query data with Spark, but I cannot connect to the same catalog through dbt. Sometimes dbt adds an iceberg_catalog that I never defined, and sometimes it simply cannot find the Iceberg table. In version 2 below you can see that dbt is not even picking up my catalog name.

While building a modern data pipeline with dbt, Apache Spark, and Apache Iceberg, I ran into a frustrating error during dbt run. My setup uses Google Cloud Storage (GCS) as the backing warehouse, and I want to read Iceberg tables stored in gs://flink_iceberg_data_bucket/iceberg_warehouse via the Spark Thrift Server (or the session connection method). Here is how I set it up, what worked, what failed, and how I am troubleshooting the problem.

THIS SPARK COMMAND RETURNS DATA!:
$env:GOOGLE_APPLICATION_CREDENTIALS="C:\Users\xxxx.dbt\dev_key.json"

& "C:\tools\spark\spark-3.5.0-bin-hadoop3\sbin\start-thriftserver.cmd" --master local[*] `
  --jars "C:\tools\spark\spark-3.5.0-bin-hadoop3\jars\gcs-connector-hadoop3-latest.jar" `
  --conf spark.sql.catalog.spark_catalog=org.apache.iceberg.spark.SparkCatalog `
  --conf spark.sql.catalog.spark_catalog.type=hadoop `
  --conf spark.sql.catalog.spark_catalog.warehouse=gs://flink_iceberg_data_bucket/iceberg_warehouse `
  --conf spark.hadoop.fs.gs.impl=com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem `
  --conf spark.hadoop.fs.AbstractFileSystem.gs.impl=com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS `
  --conf spark.hadoop.google.cloud.auth.service.account.enable=true `
  --conf spark.hadoop.google.cloud.auth.service.account.json.keyfile="C:\Users\xxxx.dbt\dev_key.json"

spark.sql("SELECT * FROM gcs_iceberg_catalog.gcs_db.test_game_events LIMIT 5").show()
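
To double-check the names, the same PySpark session can also run a few sanity queries (a sketch of what I check; the catalog, schema, and table names are the ones from my setup above):

spark.sql("SHOW NAMESPACES IN gcs_iceberg_catalog").show()      # namespaces the Iceberg catalog can see
spark.sql("SHOW TABLES IN gcs_iceberg_catalog.gcs_db").show()   # should list test_game_events
spark.sql("SELECT current_catalog(), current_schema()").show()  # what unqualified names resolve against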

The following dbt settings cannot find the Iceberg catalog:
I’m running:

  • dbt v1.9.6
  • Spark v3.5.0 (Windows)
  • Iceberg catalog via GCS
  • dbt-spark plugin v1.9.2
  • Spark Thrift Server with GCS + Iceberg config

profiles.yml:
outputs:
  local:
    type: spark
    method: session
    host: localhost
    catalog: gcs_iceberg_catalog
    schema: gcs_db
    session_properties:
      spark.sql.catalog.gcs_iceberg_catalog: org.apache.iceberg.spark.SparkCatalog
      spark.sql.catalog.gcs_iceberg_catalog.type: hadoop
      spark.sql.catalog.gcs_iceberg_catalog.warehouse: gs://flink_iceberg_data_bucket/iceberg_warehouse
      spark.hadoop.fs.gs.impl: com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem
      spark.hadoop.fs.AbstractFileSystem.gs.impl: com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS
      spark.hadoop.google.cloud.auth.service.account.enable: "true"
      spark.hadoop.google.cloud.auth.service.account.json.keyfile: C:/Users/xxx/.dbt/dev_key.json

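One thing I still want to try is pointing dbt at the Thrift Server I already start with the command above, instead of letting the session method spin up its own local Spark session, so that the catalog configs passed to start-thriftserver definitely apply. A minimal sketch of that profile, assuming the server listens on the default port 10000 (local_thrift is just a name I made up):

outputs:
  local_thrift:
    type: spark
    method: thrift
    host: localhost
    port: 10000
    schema: gcs_db
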
dbt test view version 1:
{{ config(
    materialized='view'
) }}

SELECT * FROM game_events
LIMIT 10

ERROR 1:
10:23:48 1 of 1 ERROR creating sql view model gcs_db.test_read … [ERROR in 0.42s]
10:23:48
10:23:48 Finished running 1 view model in 0 hours 0 minutes and 19.97 seconds (19.97s).
10:23:48
10:23:48 Completed with 1 error, 0 partial successes, and 0 warnings:
10:23:48
10:23:48 Runtime Error in model test_read (models\gamebot_iceberg\test_read.sql)
Runtime Error
[TABLE_OR_VIEW_NOT_FOUND] The table or view game_events cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS.; line 8 pos 14;
'CreateViewCommand spark_catalog.gcs_db.test_read, SELECT * FROM game_events
LIMIT 10, false, true, PersistedView, false
+- 'GlobalLimit 10
+- 'LocalLimit 10
+- 'Project [*]
+- 'UnresolvedRelation [game_events], , false
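
Since the plan shows the bare table name being resolved against spark_catalog.gcs_db rather than my catalog, one workaround I am testing is fully qualifying the relation inside the model. A sketch, using the table I can already read from Spark:

{{ config(
    materialized='view'
) }}

SELECT *
FROM gcs_iceberg_catalog.gcs_db.test_game_events
LIMIT 10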

dbt test view version 2:
{{ config(
    materialized='view',
    catalog='gcs_iceberg_catalog',
    database='gcs_db'
) }}

SELECT * FROM game_events LIMIT 10

ERROR 2:
10:29:20 1 of 1 ERROR creating sql view model gcs_db.test_read … [ERROR in 0.41s]
10:29:20
10:29:20 Finished running 1 view model in 0 hours 0 minutes and 18.37 seconds (18.37s).
10:29:20
10:29:20 Completed with 1 error, 0 partial successes, and 0 warnings:
10:29:20
10:29:20 Runtime Error in model test_read (models\gamebot_iceberg\test_read.sql)
Runtime Error
[TABLE_OR_VIEW_NOT_FOUND] The table or view game_events cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS.; line 8 pos 14;
'CreateViewCommand spark_catalog.gcs_db.test_read, SELECT * FROM game_events LIMIT 10, false, true, PersistedView, false
+- 'GlobalLimit 10
+- 'LocalLimit 10
+- 'Project [*]
+- 'UnresolvedRelation [game_events], , false
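
The next thing I plan to check is which catalog and schema the dbt-spark session actually resolves names against, since the plan above still says spark_catalog. A sketch of that check (I have not confirmed the output yet):

dbt show --inline "select current_catalog(), current_schema()" --target local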