Performance of each model over time

Across all the projects where I use dbt, one thing is missing on the operational side: the ability to see how a specific model has performed over time.

When the whole job starts taking longer, you have to dig through that job’s logs to get any per-model detail. How do you obtain that information?


Great question! The most common approaches I see are:

  1. If you use dbt Cloud, the Model Timing tab does exactly this.
  2. Alternatively, you can use a package like dbt_artifacts, which creates tables in your warehouse containing the results of each dbt invocation, including the time spent building each table (see the sketch below this list).
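
For anyone who wants the raw numbers without dbt Cloud or an extra package: the same per-model timing is written locally to `target/run_results.json` after every invocation, which is also the data the artifacts approach builds on. A minimal Python sketch for pulling it out (field names follow dbt’s run results artifact schema; double-check against your dbt version):

```python
import json
from pathlib import Path

# Parse dbt's run results artifact (written after every `dbt run` / `dbt build`).
# Field names follow the documented run_results.json schema; the schema is
# versioned, so verify against your dbt version.
def model_timings(run_results_path="target/run_results.json"):
    data = json.loads(Path(run_results_path).read_text())
    generated_at = data["metadata"]["generated_at"]
    return [
        {
            "model": r["unique_id"],            # e.g. model.my_project.fct_orders
            "status": r["status"],
            "execution_time_s": r["execution_time"],
            "generated_at": generated_at,
        }
        for r in data["results"]
    ]

if __name__ == "__main__":
    # Print slowest models first for the most recent invocation.
    for row in sorted(model_timings(), key=lambda r: -r["execution_time_s"]):
        print(f'{row["execution_time_s"]:8.2f}s  {row["model"]}')
```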

I don’t have this option, probably because I’m not on the Team plan. The question is: does it show model timing only for a specific run, or across all runs?

The Model Timing tab shows the analytics for a single run at a time. The data stored by the dbt_artifacts package lets you compare performance from one run to another.
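
If you build the comparison yourself instead of using the package, one option is to archive `run_results.json` after each invocation and aggregate the copies. A rough sketch, assuming a hypothetical `archive/` directory holding one JSON file per run:

```python
import json
from pathlib import Path

# Compare one model's execution time across archived run results.
# Assumes you copy target/run_results.json to archive/<some-name>.json after
# each run -- the archive/ layout is an assumption, not something dbt does.
def timing_history(model_unique_id, archive_dir="archive"):
    history = []
    for path in sorted(Path(archive_dir).glob("*.json")):
        data = json.loads(path.read_text())
        for r in data["results"]:
            if r["unique_id"] == model_unique_id:
                history.append(
                    (data["metadata"]["generated_at"], r["execution_time"])
                )
    return history

if __name__ == "__main__":
    # "model.my_project.fct_orders" is a hypothetical model name.
    for ts, secs in timing_history("model.my_project.fct_orders"):
        print(f"{ts}  {secs:.2f}s")
```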

Are there any plans to make such important data available in an easy way, without installing extra packages? Operationally, it was the first thing I asked about when evaluating dbt.

If you were on the Team plan, you could also query the Metadata API directly to track performance over time.
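
For illustration, the Metadata API is a GraphQL endpoint you can hit with any HTTP client. The endpoint, query shape, and field names in this sketch are assumptions based on the Metadata API as documented; verify them against the current schema before relying on it:

```python
import requests

# Query the dbt Cloud Metadata API (GraphQL) for per-model execution times.
# The endpoint, query shape, and field names are assumptions -- check the
# current dbt Cloud docs for the exact schema available on your plan.
METADATA_URL = "https://metadata.cloud.getdbt.com/graphql"
QUERY = """
query ModelTimings($jobId: Int!) {
  models(jobId: $jobId) {
    uniqueId
    status
    executionTime
  }
}
"""

def fetch_model_timings(service_token, job_id):
    resp = requests.post(
        METADATA_URL,
        headers={"Authorization": f"Bearer {service_token}"},
        json={"query": QUERY, "variables": {"jobId": job_id}},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["data"]["models"]

# Example usage (replace with your own service token and job id):
# for m in fetch_model_timings("<service-token>", 12345):
#     print(m["executionTime"], m["uniqueId"])
```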

I don’t have a specific tool recommendation, but there are also observability platforms that will ingest this data and handle it for you if you don’t want to build it yourself.

I don’t have anything to share on future product plans, but I will pass the message on! I will note that the ability for users to extend dbt’s built-in functionality and share their results with the community as packages/plugins is a key benefit.
