[Snowflake adapter] Does it support transactions (pre-hook + model run in the same transaction)?

The docs suggest that only the Postgres and Redshift adapters support transactions (meaning the pre-hook(s) run in the same transaction as the model).
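For context, on the adapters that do support transactions, dbt documents helpers to take a hook out of the model's transaction instead; a minimal sketch (Postgres/Redshift only, table name hypothetical):

--on Postgres/Redshift, before_begin/after_commit run a hook outside the model's transaction:
{{ config(
    pre_hook=before_begin("vacuum some_table"),
    post_hook=after_commit("analyze some_table")
) }}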

  1. So I wonder: as of today (5-OCT-2024), does the Snowflake adapter still not support this?
  2. Where can I find a web page listing adapters vs. transaction support (a yes/no column for the different target databases)?

Thanks~

Here is some test code I prepared, if anyone could test this:

--dbt-snowflake adapter test transaction support (pre-hook, post-hook + model run in same transaction).sql


--setup (regular tables rather than temporary ones, since temporary tables are
--session-scoped and would not be visible to dbt's own connection):
create or replace table db.s.stg_table(id number);
create or replace table db.s.control(batch_id number, s varchar(10));


--================================================================================================--
--test 1 - a successful pre-hook

--config (pre/post hooks; note multiple hook statements go in a list):
{{ config(
    pre_hook=[
        "delete from db.s.stg_table",
        "insert into db.s.control (select '1', 'started..')"
    ],
    post_hook="insert into db.s.control (select '1', 'completed!')"
) }}

--model:
insert into db.s.stg_table (select * from values (10),(20));
---------------------------------------------------------------------------


--after dbt run both tables should show:
select * from db.s.control;
BATCH_ID|S         |
--------+----------+
       1|started.. |
       1|completed!|

select * from db.s.stg_table;
ID|
--+
10|
20|
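
--note (a sketch, not tested): in an actual dbt project the model body must be a select,
--since dbt wraps it in the materialization. test 1 as a real model could look like the
--following, assuming the file is named stg_table.sql and is materialized as incremental
--so that dbt appends the select's rows to db.s.stg_table:

{{ config(
    materialized='incremental',
    pre_hook=[
        "delete from db.s.stg_table",
        "insert into db.s.control (select '1', 'started..')"
    ],
    post_hook="insert into db.s.control (select '1', 'completed!')"
) }}
select column1 as id from values (10), (20)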




--================================================================================================--
--test 2 - force an error in the pre-hook (a varchar that is too long, for example)
--         stg_table should NOT be deleted if the snowflake adapter supports transactions!

--pre-hook:
{{ config(
    pre_hook=[
        "delete from db.s.stg_table",
        "insert into db.s.control (select '2', 'started.. **OOPS TOO BIG VARCHAR ON PRE-HOOK')"
    ],
    post_hook="insert into db.s.control (select '2', 'completed!')"
) }}

--model:
insert into db.s.stg_table (select * from values (33));
---------------------------------------------------------------------------


--after dbt run, neither table should have been affected, meaning they should show the same rows as after the 1st test:
select * from db.s.control;
BATCH_ID|S         |
--------+----------+
       1|started.. |
       1|completed!|

select * from db.s.stg_table;
ID|
--+
10|
20|




--================================================================================================--
--test 3 - force an error in the post-hook (a varchar that is too long, for example)
--         stg_table should NOT end up deleted if the snowflake adapter supports transactions
--         (the post-hook failure should roll the whole run back)!

--pre-hook:
{{ config(
    pre_hook=[
        "delete from db.s.stg_table",
        "insert into db.s.control (select '3', 'started..')"
    ],
    post_hook="insert into db.s.control (select '3', 'completed!  **OOPS TOO BIG VARCHAR ON POST-HOOK!!!!')"
) }}

--model:
insert into db.s.stg_table (select * from values (33));
---------------------------------------------------------------------------


--again, after dbt run neither table should have been affected, meaning they should show the same rows as after the 1st test:
select * from db.s.control;
BATCH_ID|S         |
--------+----------+
       1|started.. |
       1|completed!|

select * from db.s.stg_table;
ID|
--+
10|
20|

Thanks ~

I tested this and can confirm that each SQL statement in the pre- and post-hooks seems to run in its own transaction. When the 2nd statement in the pre-hook failed due to the varchar overflow, the STG table had still been emptied by the 1st SQL statement in the pre-hook.
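One way to double-check this (my own idea, not something dbt reports directly) is to compare TRANSACTION_ID across the statements in Snowflake's query history; note that ACCOUNT_USAGE can lag by up to ~45 minutes:

--if every statement shows a different transaction_id, the hooks are auto-committing individually
select query_text, transaction_id, execution_status, start_time
from snowflake.account_usage.query_history
where query_text ilike '%db.s.stg_table%'
order by start_time desc;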

I tried to fudge this by adding 'begin transaction' as the 1st pre-hook and 'commit' as the last post-hook. That worked in the sense that the tables were not updated on failure. However, on the next run dbt got stuck on the delete pre-hook, indicating a lock left behind because the previous failed transaction was never explicitly rolled back.
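For anyone who wants to reproduce that workaround, this is roughly the shape it took (a sketch, not a recommendation, given the locking problem just described):

{{ config(
    pre_hook=[
        "begin transaction",
        "delete from db.s.stg_table",
        "insert into db.s.control (select '1', 'started..')"
    ],
    post_hook=[
        "insert into db.s.control (select '1', 'completed!')",
        "commit"
    ]
) }}

If a run dies mid-transaction and leaves a lock behind, it can be cleared manually with standard Snowflake commands:

--find the open transaction holding the lock, then abort it by id
show transactions;
select system$abort_transaction(<transaction_id>);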