Hi everyone,
We run our ETL process through Cosmos, which orchestrates the dbt runs via Airflow and populates Databricks tables with the output.
I need to add Airflow metadata, such as execution_date or task_id, to the Databricks tables.
I am looking for the best practice for doing this.
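One direction I've been considering, in case it helps frame the question: pass the Airflow context into dbt as variables and stamp them onto the tables inside the models. Below is a minimal sketch of that idea. The project path, profile names, DAG id, and model names are placeholders, and I'm assuming Cosmos forwards operator_args (including a templated vars entry) to each generated dbt operator; please correct me if that's not how it works, or if there is a better pattern.

```python
# Sketch only: paths, profile names, and the DAG id are placeholders.
from datetime import datetime

from cosmos import DbtDag, ProfileConfig, ProjectConfig

profile_config = ProfileConfig(
    profile_name="databricks_profile",   # placeholder: my dbt profile name
    target_name="prod",                  # placeholder: my dbt target
    profiles_yml_filepath="/usr/local/airflow/dbt/profiles.yml",
)

dbt_databricks_etl = DbtDag(
    dag_id="dbt_databricks_etl",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    project_config=ProjectConfig("/usr/local/airflow/dbt/my_dbt_project"),
    profile_config=profile_config,
    # Assumption: operator_args are forwarded to every generated dbt operator,
    # and the "vars" entry is Jinja-templated before dbt is invoked
    # (i.e. it ends up as `dbt run --vars '{...}'`).
    operator_args={
        "vars": {
            "airflow_execution_date": "{{ ds }}",
            "airflow_run_id": "{{ run_id }}",
            "airflow_task_id": "{{ task.task_id }}",
        },
    },
)

# Each dbt model could then stamp the metadata as ordinary columns, e.g.:
#
#   select
#       *,
#       '{{ var("airflow_execution_date", "unknown") }}' as airflow_execution_date,
#       '{{ var("airflow_task_id", "unknown") }}'        as airflow_task_id
#   from {{ ref("stg_orders") }}
```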
Please advise.
Thanks,
Mali