Any tip/tool to get some insight from the log details?
e.g.
- longest-running model
- etc.
Yes, I can parse it with a bash script or even import it into SQLite somehow and play with it … open to suggestions (for instance, something like the sketch below).
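A minimal sketch of that idea, assuming a recent `run_results.json` layout where each entry under `results` carries `unique_id`, `status` and `execution_time` in seconds (older dbt versions nest these fields differently), and using made-up names `dbt_timings.db` / `run_timings` for the SQLite side:

```python
#!/usr/bin/env python3
"""Rough sketch: pull model timings out of dbt's target/run_results.json."""
import json
import sqlite3
from pathlib import Path

RUN_RESULTS = Path("target/run_results.json")  # default dbt artifact location


def load_results(path: Path):
    data = json.loads(path.read_text())
    generated_at = data["metadata"]["generated_at"]
    for r in data["results"]:
        # execution_time can be missing/None for skipped nodes
        yield generated_at, r["unique_id"], r["status"], r.get("execution_time") or 0.0


def main():
    rows = list(load_results(RUN_RESULTS))

    # Longest-running nodes first
    for generated_at, unique_id, status, secs in sorted(rows, key=lambda r: r[3], reverse=True)[:10]:
        print(f"{secs:8.2f}s  {status:<8}  {unique_id}")

    # Keep a history in SQLite so later runs can be compared
    con = sqlite3.connect("dbt_timings.db")
    con.execute(
        """CREATE TABLE IF NOT EXISTS run_timings (
               generated_at TEXT, unique_id TEXT, status TEXT, execution_time REAL)"""
    )
    con.executemany("INSERT INTO run_timings VALUES (?, ?, ?, ?)", rows)
    con.commit()
    con.close()


if __name__ == "__main__":
    main()
```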
Hey @obar1, have you seen the dbt_artifacts package on the dbt Package Hub? It contains macros which will materialise your job timing and other metadata, including how long each model takes to build.
Also, if you’re using dbt Cloud, you can use the Model timing tab to visualise your longest-running nodes.
Cool! A handy feature would be adding an "export log" kind of button to the Model timing tab so I could get it as text … as it is, I need to mouse over to see the details. Thanks!
Hi @joellabes, have you seen this: Using dbt artifacts to track project performance - Show and Tell - dbt Community Forum (getdbt.com)? It plays with run_results.json and looks like a practical solution.
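Building on the sketch above, once a few runs have been loaded into the (hypothetical) `run_timings` table in `dbt_timings.db`, a small query can compare per-model runtimes across runs, which is roughly what that forum post does with run_results.json:

```python
#!/usr/bin/env python3
"""Follow-up sketch: compare per-model runtimes across the runs stored earlier."""
import sqlite3

con = sqlite3.connect("dbt_timings.db")
query = """
    SELECT unique_id,
           COUNT(*)                      AS runs,
           ROUND(AVG(execution_time), 2) AS avg_s,
           ROUND(MAX(execution_time), 2) AS max_s
    FROM run_timings
    GROUP BY unique_id
    ORDER BY avg_s DESC
    LIMIT 10
"""
for unique_id, runs, avg_s, max_s in con.execute(query):
    print(f"{avg_s:8.2f}s avg  {max_s:8.2f}s max  ({runs} runs)  {unique_id}")
con.close()
```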
I have! The first version of dbt_artifacts parsed run_results.json (which is how it got its name), but it changed to using the run’s context variables as its data source so that it could be used by warehouses other than just Snowflake.