OSS Unity Catalog

Hello, does anyone know if it is possible to use OSS Unity Catalog with dbt-core and dbt-spark? I am using OSS Apache Spark 4.1.1 with OSS Unity Catalog 0.4.0 under WSL on my local Windows laptop. So far I have been trying to use a Spark session, passing in the Spark config via the dbt profile as follows:

```yaml
default:
  outputs:
    dev:
      type: spark
      method: session
      host: localhost
      catalog: bronze
      schema: dbt
      server_side_parameters:
        "spark.jars.packages": "io.unitycatalog:unitycatalog-spark_2.13:0.4.0,io.delta:delta-spark_2.13:4.1.0,io.delta:delta-kernel-api:4.1.0,io.delta:delta-kernel-defaults:4.1.0"
        "spark.sql.extensions": "io.delta.sql.DeltaSparkSessionExtension"
        "spark.sql.catalog.spark_catalog": "org.apache.spark.sql.delta.catalog.DeltaCatalog"
        "spark.databricks.delta.preview.enabled": "true"
        "spark.databricks.delta.catalog.update.enabled": "true"
        "spark.sql.catalog.bronze": "io.unitycatalog.spark.UCSingleCatalog"
        "spark.sql.catalog.bronze.uri": "http://localhost:8080"
        "spark.sql.catalog.bronze.token": ""
        # "spark.sql.defaultCatalog": "bronze"
  target: dev
```
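To rule out dbt itself, it may help to first verify that the same settings reach Unity Catalog from a plain PySpark shell. A sketch, with the package coordinates and UC endpoint copied from the profile above and assuming the `bronze` catalog already exists on the UC server:

```shell
# Launch PySpark with the same UC/Delta wiring as the dbt profile
pyspark \
  --packages "io.unitycatalog:unitycatalog-spark_2.13:0.4.0,io.delta:delta-spark_2.13:4.1.0" \
  --conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" \
  --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog" \
  --conf "spark.sql.catalog.bronze=io.unitycatalog.spark.UCSingleCatalog" \
  --conf "spark.sql.catalog.bronze.uri=http://localhost:8080" \
  --conf "spark.sql.catalog.bronze.token=" \
  --conf "spark.sql.defaultCatalog=bronze"
```

Inside that shell, something like `spark.sql("SHOW SCHEMAS IN bronze").show()` should list the UC schemas; if it fails or only shows the local warehouse, the problem is in the Spark-to-UC wiring rather than in dbt.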

Whatever I try, dbt always writes to my local project folder in VS Code. I have also tried modifying my spark-defaults.conf, without success.
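For reference, the spark-defaults.conf route would spell the same keys without quotes, whitespace-separated. A sketch mirroring the profile above; note that a SparkSession created directly in a Python process (as dbt-spark's session method does) may only pick this file up if `SPARK_CONF_DIR`/`SPARK_HOME` point at the directory containing it:

```
# $SPARK_HOME/conf/spark-defaults.conf -- same settings as the dbt profile
spark.jars.packages               io.unitycatalog:unitycatalog-spark_2.13:0.4.0,io.delta:delta-spark_2.13:4.1.0
spark.sql.extensions              io.delta.sql.DeltaSparkSessionExtension
spark.sql.catalog.spark_catalog   org.apache.spark.sql.delta.catalog.DeltaCatalog
spark.sql.catalog.bronze          io.unitycatalog.spark.UCSingleCatalog
spark.sql.catalog.bronze.uri      http://localhost:8080
spark.sql.defaultCatalog          bronze
```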

Thanks