The problem I’m having
Looking for suggestions on where to push and save dbt models used for one-time, ad hoc analysis.
The context of why I’m trying to do this
Using dbt Cloud and Snowflake.
I had to perform a backfill on a database table. To validate it, I created one model that runs against a backup of the table taken before the backfill, and a second model that runs against the table after the backfill.
When done, I had two tables I could join to look for errors in the backfill. I ran the validation in my personal dbt development schema.
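Roughly, the validation join looked like this (a simplified sketch; the model and column names here are made up, not my actual schema):

```sql
-- Sketch of the validation model (all names are illustrative).
-- Joins the pre-backfill backup to the post-backfill table and
-- surfaces any rows where the backfilled column disagrees.
select
    pre.id,
    pre.amount as amount_before,
    post.amount as amount_after
from {{ ref('orders_backup_pre_backfill') }} as pre
join {{ ref('orders_post_backfill') }} as post
    on pre.id = post.id
where pre.amount is distinct from post.amount
```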
Now I have these two dbt models, plus the related changes to the relevant .yml files, in a feature branch.
These models will not run in production. It was a one-time effort.
I don’t want to lose the work. Yet, I don’t want to put them in the models directory with production models.
What I’ve already tried
I thought about putting them in the analysis folder. There are changes to the .yml files in my branch; it’d be OK if I lose that content, since I could recreate the entries if needed. There is still the issue, though, of having non-production models and artifacts in the same repo where the production models are saved.
I also looked at committing to a repo that is used for ad hoc work. I’d probably copy-and-paste the models into .sql files in that repo and commit them, though there’d be no file history for them there.
I also thought about leaving the models and related changes in an unmerged branch in the production repo, and letting it sit there until it’s obsolete or until it passes a stale date noted in a README on the branch.
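A variation on this I’m considering: tag the tip of the branch so the commit stays reachable even if the branch itself is eventually deleted (a sketch; the tag name is made up):

```shell
# Tag the final commit of the feature branch (tag name is illustrative),
# so the work stays reachable even after the branch is deleted.
git tag -a adhoc/backfill-validation -m "One-time backfill validation models"
git push origin adhoc/backfill-validation
```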