Problem: when running continuous integration (CI) tests against BigQuery, dbt Cloud will, by default, create datasets in your production BigQuery environment, prefixed with dbt_cloud_pr.
Solution: create a separate BigQuery project for CI builds and instruct dbt Cloud to build into that project instead when it runs the CI job for a pull request.
- Create a new Google Cloud Platform project and enable billing.
- Add your existing production dbt Cloud BigQuery service account to the new GCP project; assign BigQuery Editor permissions to this existing account in the new project.
- Create a CI environment in your existing dbt Cloud project.
- Create a CI job in the CI environment (a sketch for reproducing this setup locally with dbt Core follows the list):
  a. Environment target name: set the target name to `dbt_cicd` (or whatever you like, e.g. your new GCP project's name).
  b. Execution settings: defer to a previous run state and choose a frequently-run production job to defer to. You might also want to set a run timeout; I used 3600 seconds to time out after 1 hour.
  c. Commands to execute: `dbt build --select state:modified+`
  d. Triggers: in the Webhooks tab, turn on "Run on Pull Requests".
- Add target-name logic to your project's `dbt_project.yml`. More details: Using custom databases | dbt Docs.

  ```yaml
  models:
    <your-resource-path-here>:
      +database: "{% if target.name == 'dbt_cicd' %}your-new-gcp-project{% else %}{{ target.database }}{% endif %}"
  ```
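
As an alternative to the one-liner above, the same target-name logic can live in an override of dbt's `generate_database_name` macro, which then applies to every model without per-folder configuration. This is a minimal sketch, assuming the `dbt_cicd` target name and the placeholder project ID `your-new-gcp-project` from the earlier steps; adjust both to your own setup.

```sql
-- macros/generate_database_name.sql
-- Route all models to the CI-only GCP project whenever the job runs with the
-- dbt_cicd target; otherwise fall back to dbt's default database resolution.
{% macro generate_database_name(custom_database_name=none, node=none) -%}
    {%- if target.name == 'dbt_cicd' -%}
        your-new-gcp-project
    {%- elif custom_database_name is none -%}
        {{ target.database }}
    {%- else -%}
        {{ custom_database_name | trim }}
    {%- endif -%}
{%- endmacro %}
```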
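
If you want to reproduce the CI job locally with dbt Core before wiring up dbt Cloud, a target with the same name can live in `profiles.yml`. This is a hedged sketch rather than part of the original setup: the profile name `my_dbt_project`, the `dbt_ci` dataset, the keyfile path, and the thread count are placeholder assumptions; only the `dbt_cicd` target name and the CI project ID mirror the steps above.

```yaml
# ~/.dbt/profiles.yml (sketch; the profile name must match `profile:` in dbt_project.yml)
my_dbt_project:
  target: dbt_cicd
  outputs:
    dbt_cicd:
      type: bigquery
      method: service-account              # assumes key-file auth; oauth also works
      project: your-new-gcp-project        # the CI-only GCP project created above
      dataset: dbt_ci                      # placeholder dataset for CI builds
      keyfile: /path/to/ci-service-account.json
      threads: 4
```

With that in place, something like `dbt build --select state:modified+ --defer --state path/to/prod-artifacts --target dbt_cicd` approximates what the dbt Cloud CI job runs: `--defer` and `--state` point dbt at production artifacts the same way the "defer to previous run state" setting does.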