Saving test error "notifications" for Power BI

Hi all,

Our current setup is as follows:
We have a Snowflake data lake, on which we run several dbt models to generate our Marts. These Marts are then transformed into Analytics models, which are the direct input for our Power BI reports.
The issue is that whenever a test error occurs, all downstream models are skipped, so not all data in Power BI is up to date. We were wondering what our options are to surface such errors in Power BI and let our end users know that something has not been updated.

I tried the --store-failures option, but that just stores the failing rows in new tables. All we need is, for example, a table TEST_ERRORS that writes a single row whenever a run fails to complete successfully. Then I can look up this table in Power BI for the date the tests ran and, if any rows are there, show a message to the users. But I have no idea how to set this up.
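To make it concrete, something along these lines is what I picture (just a rough sketch, the schema and table names are made up):

```sql
-- Sketch of the TEST_ERRORS table I have in mind (all names are placeholders)
create table if not exists analytics.audit.test_errors (
    run_ts     timestamp_ntz,
    node_name  varchar,   -- the model or test that failed
    status     varchar,   -- e.g. 'fail' or 'error'
    message    varchar
);

-- What I would then query from Power BI for a given refresh date:
-- any rows returned means "show a warning to the users".
select *
from analytics.audit.test_errors
where run_ts::date = current_date;
```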

How do your organizations / setups handle issues like these? Thanks!

Kind regards,
Sven

Hi Sven,

I just arrived at this forum and am only starting my own dbt journey, so my feedback may not be that valuable.
If you have the extended test/status output persisted (as in the Stack Overflow question "DBT - write dbt test --store-failures to a specific table in my data warehouse"), couldn't you then use that information to summarize it yourself into one custom status line?
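I have not built this myself yet, but from what I've read it could look roughly like an on-run-end hook that inspects the results variable and writes one row per failed or errored node into an audit table. A rough, untested sketch (the macro name, schema, and table names are all placeholders, and the audit table is assumed to already exist, e.g. as in your sketch above):

```sql
-- macros/log_run_failures.sql  (file and macro name are made up)
-- Would be registered in dbt_project.yml as:
--   on-run-end:
--     - "{{ log_run_failures(results) }}"

{% macro log_run_failures(results) %}
  {% if execute %}

    {# collect nodes that failed a test or errored out #}
    {% set failed = [] %}
    {% for res in results %}
      {% if res.status in ['fail', 'error'] %}
        {% do failed.append(res) %}
      {% endif %}
    {% endfor %}

    {% if failed | length > 0 %}
      {% set insert_sql %}
        insert into analytics.audit.test_errors (run_ts, node_name, status, message)
        values
        {% for res in failed %}
          ( current_timestamp(),
            '{{ res.node.name }}',
            '{{ res.status }}',
            '{{ (res.message or "") | replace("'", "''") }}' )
          {% if not loop.last %},{% endif %}
        {% endfor %}
      {% endset %}
      {% do run_query(insert_sql) %}
    {% endif %}

  {% endif %}
{% endmacro %}
```

Power BI could then read that audit table directly, which sounds close to the TEST_ERRORS table you describe.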

Or, an uglier hack: check in Power BI whether the latest incoming data exceeds a minimum timestamp you would expect for each refresh of the semantic model. For example, if you expect audit events to arrive in a table in Power BI every day, you can calculate in Power Query whether there are any events with a timestamp later than the minimum timestamp you would expect that day.
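If you would rather keep that check in the warehouse instead of in Power Query, the same idea could be a tiny view the report reads. Again just a sketch with made-up names (fact_audit_events and its loaded_at column are assumptions), expecting at least one event in the last day:

```sql
-- Warehouse-side variant of the freshness check
-- (instead of doing the calculation in Power Query).
create or replace view analytics.audit.freshness_check as
select
    max(loaded_at) as last_loaded_at,
    iff(max(loaded_at) >= dateadd(day, -1, current_timestamp()),
        'FRESH', 'STALE') as freshness_status
from analytics.marts.fact_audit_events;
```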