Dbt dev/prod data pipelines


I’m a first-time adopter of dbt, having picked it up in the last couple of months. I have a question about best practices for handling dbt runs in dev and prod. I’d like to:

  1. Have a daily job that runs dbt run to target a dev schema
  2. Run dbt test to make sure all the unit tests pass
  3. Run dbt run --target prod

A couple of questions:

  • Is this order of operations typical?
  • We’re using BigQuery. Is it possible to have the dev target in one GCP “dev” project and the prod target in a GCP “prod” project?

Hello there,

You can definitely target two different GCP projects.


# profiles.yml
bigquery: &bigquery
  # run the following command to authenticate:
  # $> gcloud auth application-default login --scopes=https://www.googleapis.com/auth/bigquery
  type: bigquery
  method: oauth
  dataset: analytics
  threads: 10
  timeout_seconds: 300
  priority: interactive
  retries: 1

acme:
  outputs:
    dev:
      <<: *bigquery
      project: acme-dev
    prod:
      <<: *bigquery
      project: acme-prod
  target: dev
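With a profile like that in place, the daily dev-then-prod flow from your list boils down to three fail-fast commands. A sketch (the profile and target names here are assumptions — they have to match whatever your own profiles.yml defines):

```shell
#!/usr/bin/env bash
set -euo pipefail   # abort the script at the first failing command

dbt run --target dev     # build all models into the dev GCP project
dbt test --target dev    # test the dev build; any failure stops the script here
dbt run --target prod    # only reached once the dev build passed its tests
```

Because `target: dev` is the default in the profile, the `--target dev` flags are technically redundant, but spelling them out makes the job easier to read.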

@pcreux I haven’t seen this <<: syntax before - does this let you take all of the contents of an anchor and then append extra keys? That’s very exciting because it has always annoyed me that I couldn’t make that work!

@joellabes Yes! You’re totally right!
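If you want to convince yourself quickly, you can poke at the merge key from Python — PyYAML resolves `<<:` while loading, so you can see the anchor’s keys land in the merged mapping (the keys below are made up for the demo):

```python
import yaml  # PyYAML: its loaders resolve the `<<:` merge key during parsing

doc = """
bigquery: &bigquery
  type: bigquery
  threads: 10

dev:
  <<: *bigquery       # pull in every key from the anchor...
  project: acme-dev   # ...then add (or override) keys of our own
"""

dev = yaml.safe_load(doc)["dev"]
print(dev["type"], dev["threads"], dev["project"])  # bigquery 10 acme-dev
```

Keys defined alongside the merge win over keys coming from the anchor, which is what makes the dev/prod `project` override in the profile work.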

Watch this then :grinning: YAML Basics: Anchors and Aliases - YouTube
