Order of execution: macros vs. pre_hook for an incremental model

I need some help or guidance here.

I have an incremental model that creates a temp table in a pre_hook statement.
Later, the model should execute a macro that calls a stored procedure; the temp table created by the pre_hook is an input parameter of the stored procedure.

My problem is that the macro executes before the pre_hook runs.

Is there any way to achieve this?

I am attaching the model and the dbt logs below.
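For context, append_placekeys wraps a call to the APPEND_PLACEKEYS stored procedure and executes it with run_query, so the call fires as soon as dbt renders the model body at compile time, before any pre_hook statements run in the execution phase. A simplified sketch of the macro (illustrative only, not the actual macro; the real call passes more arguments, as the log below shows):

    {% macro append_placekeys(tbl_in, tbl_out) %}
        {# run_query executes immediately while the model body is being
           rendered (compiled), i.e. before the pre_hook statements run #}
        {% do run_query("CALL APPEND_PLACEKEYS('" ~ tbl_in ~ "', '" ~ tbl_out ~ "')") %}
    {% endmacro %}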

dbt MODEL: dx_store_plck.sql

{{
    config(
        materialized='incremental',
        unique_key='DX_ID',
        pre_hook=[
                """
                    CREATE OR REPLACE TEMP TABLE DXH_STORE_PLCK_IN
                    AS
                    SELECT DISTINCT
                        s.dx_id AS msa_id,
                        s.btg_store_nm AS store_name,
                        s.btg_city AS city,
                        s.btg_state AS state,
                        s.btg_address AS street_addr,
                        s.btg_zip5 AS zipcode,
                        'US' AS COUNTRY_CODE,
                        s.latitude AS LATITUDE,
                        s.longitude AS LONGITUDE,
                        COALESCE(p.last_updt_dt, TO_DATE('2023-01-01')) AS last_updt_dt
                    FROM {{ ref('dx_store_attr') }} s
                {%- if is_incremental() %}
                        FULL JOIN {{ this }} p USING (dx_id)
                    WHERE
                        (
                            p.placekey IS NULL
                        OR
                            DATEDIFF(day, p.last_updt_dt::DATE, CURRENT_TIMESTAMP()::DATE) > 90
                        )
                    ORDER BY last_updt_dt ASC, s.dx_id ASC
                    LIMIT 10000
                {%- endif -%}
                ;
                """
            ],
        tags=["dx", "refresh:daily"]
    )
}}

--load dx_store_plck

{{
    append_placekeys(
        tbl_in='DXH_STORE_PLCK_IN',
        tbl_out='DXH_STORE_PLCK_TEMP'
    )
}}

SELECT
    t.msa_id AS dx_id,
    t.placekey AS placekey,
    t.error AS error,
    CURRENT_TIMESTAMP() AS last_updt_dt
FROM
    {{ database }}.{{ schema }}.DXH_STORE_PLCK_TEMP AS t

dbt LOG:

============================== 2023-01-13 21:46:41.153585 | 02dee2e8-4e8a-4811-b95c-53983cd91787 ==============================
21:46:41.153646 [info ] [MainThread]: Running with dbt=1.3.0
21:46:45.658109 [info ] [MainThread]: Concurrency: 8 threads (target='dev')
21:46:45.659835 [info ] [MainThread]: 
21:46:45.704811 [debug] [Thread-1  ]: Began running node model.sdna_us_project.dx_store_plck
21:46:45.708568 [info ] [Thread-1  ]: 1 of 1 START sql incremental model HOUSEHOLD_DEV.dx_store_plck ................. [RUN]
21:46:45.712115 [debug] [Thread-1  ]: Acquiring new snowflake connection "model.sdna_us_project.dx_store_plck"
21:46:45.714225 [debug] [Thread-1  ]: Began compiling node model.sdna_us_project.dx_store_plck
21:46:45.717461 [debug] [Thread-1  ]: Compiling model.sdna_us_project.dx_store_plck
21:46:45.727261 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
21:46:45.730138 [debug] [Thread-1  ]: Opening a new connection, currently in state closed
21:46:46.411899 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.68 seconds
21:46:46.413582 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
21:46:46.602455 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.19 seconds
21:46:46.604585 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
CALL DP_SDNA_US.HOUSEHOLD_DEV.APPEND_PLACEKEYS(
          'DXH_STORE_PLCK_IN',
          (
            SELECT object_construct(MAPPING.*)
            FROM (
              SELECT
                'msa_id' AS PRIMARY_KEY,
                'store_name' AS LOCATION_NAME,
                'city' AS CITY,
                'state' AS REGION,
                'street_addr' AS STREET_ADDRESS,
                'zipcode' AS POSTAL_CODE,
                'LATITUDE' AS LATITUDE,
                'LONGITUDE' AS LONGITUDE,
                'false' AS strict_address_match,
                'false' AS strict_name_match
            ) AS MAPPING
          ),
          'DXH_STORE_PLCK_TEMP', 'temp', 'get_placekeys_v', 1000
        );
21:54:05.939981 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 439.33 seconds
21:54:05.996189 [debug] [Thread-1  ]: Writing injected SQL for node "model.sdna_us_project.dx_store_plck"
21:54:06.007982 [debug] [Thread-1  ]: finished collecting timing info
21:54:06.009677 [debug] [Thread-1  ]: Began executing node model.sdna_us_project.dx_store_plck
21:54:06.028810 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
21:54:06.171949 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.14 seconds
21:54:06.174469 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
21:54:06.343966 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.17 seconds
21:54:06.419534 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
21:54:06.636179 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.21 seconds
21:54:06.637514 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
21:54:06.836104 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.2 seconds
21:54:06.844633 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
CREATE OR REPLACE TEMP TABLE DXH_STORE_PLCK_IN
                    AS
                    SELECT DISTINCT
                        s.dx_id AS msa_id,
                        s.btg_store_nm AS store_name,
                        s.btg_city AS city,
                        s.btg_state AS state,
                        s.btg_address AS street_addr,
                        s.btg_zip5 AS zipcode,
                        'US' AS COUNTRY_CODE,
                        s.latitude AS LATITUDE,
                        s.longitude AS LONGITUDE,
                        COALESCE(p.last_updt_dt, TO_DATE('2023-01-01')) AS last_updt_dt
                    FROM DP_SDNA_US.HOUSEHOLD_DEV.dx_store_attr s
                        FULL JOIN DP_SDNA_US.HOUSEHOLD_DEV.dx_store_plck p USING (dx_id)
                    WHERE
                        (
                            p.placekey IS NULL
                        OR
                            DATEDIFF(day, p.last_updt_dt::DATE, CURRENT_TIMESTAMP()::DATE) > 90
                        )
                    ORDER BY last_updt_dt ASC, s.dx_id ASC
                    LIMIT 10000;
21:54:09.451083 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 2.6 seconds
21:54:09.463680 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
21:54:09.633206 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.17 seconds
21:54:09.635563 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
create or replace  view DP_SDNA_US.HOUSEHOLD_DEV.dx_store_plck__dbt_tmp
     as (
    --load dx_store_plck
SELECT
    t.msa_id AS dx_id,
    t.placekey AS placekey,
    t.error AS error,
    CURRENT_TIMESTAMP() AS last_updt_dt
FROM
    DP_SDNA_US.HOUSEHOLD_DEV.DXH_STORE_PLCK_TEMP AS t
  );
21:54:09.980829 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.34 seconds
21:54:09.994182 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
21:54:10.200047 [debug] [Thread-1  ]: SQL status: SUCCESS 4 in 0.2 seconds
21:54:10.211034 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
21:54:10.399277 [debug] [Thread-1  ]: SQL status: SUCCESS 4 in 0.18 seconds
21:54:10.409359 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
21:54:10.570893 [debug] [Thread-1  ]: SQL status: SUCCESS 4 in 0.16 seconds
21:54:10.616046 [debug] [Thread-1  ]: Writing runtime sql for node "model.sdna_us_project.dx_store_plck"
21:54:10.632582 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
BEGIN
21:54:10.845295 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.21 seconds
21:54:10.847299 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
21:54:10.848481 [debug] [Thread-1  ]: On model.sdna_us_project.dx_store_plck: /* {"app": "dbt", "dbt_version": "1.3.0", "profile_name": "sdna_us", "target_name": "dev", "node_id": "model.sdna_us_project.dx_store_plck"} */
alter session set query_tag ='';
21:54:11.027177 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.18 seconds
21:54:11.030989 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
merge into DP_SDNA_US.HOUSEHOLD_DEV.dx_store_plck as DBT_INTERNAL_DEST
        using DP_SDNA_US.HOUSEHOLD_DEV.dx_store_plck__dbt_tmp as DBT_INTERNAL_SOURCE
        on 
                DBT_INTERNAL_SOURCE.DX_ID = DBT_INTERNAL_DEST.DX_ID
            

    
    when matched then update set
        "DX_ID" = DBT_INTERNAL_SOURCE."DX_ID","PLACEKEY" = DBT_INTERNAL_SOURCE."PLACEKEY","ERROR" = DBT_INTERNAL_SOURCE."ERROR","LAST_UPDT_DT" = DBT_INTERNAL_SOURCE."LAST_UPDT_DT"
    

    when not matched then insert
        ("DX_ID", "PLACEKEY", "ERROR", "LAST_UPDT_DT")
    values
        ("DX_ID", "PLACEKEY", "ERROR", "LAST_UPDT_DT")

;
21:54:11.918211 [debug] [Thread-1  ]: SQL status: SUCCESS 20000 in 0.88 seconds
21:54:11.922083 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
COMMIT
21:54:12.400002 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.47 seconds
21:54:12.454299 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
drop view if exists DP_SDNA_US.HOUSEHOLD_DEV.dx_store_plck__dbt_tmp cascade
21:54:12.647964 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.19 seconds
21:54:12.675216 [debug] [Thread-1  ]: Using snowflake connection "model.sdna_us_project.dx_store_plck"
use warehouse DP_SDNA_US_M
21:54:12.822067 [debug] [Thread-1  ]: SQL status: SUCCESS 1 in 0.14 seconds
21:54:12.827425 [debug] [Thread-1  ]: finished collecting timing info
21:54:12.829087 [debug] [Thread-1  ]: On model.sdna_us_project.dx_store_plck: Close
21:54:13.114290 [debug] [Thread-1  ]: Sending event: {'category': 'dbt', 'action': 'run_model', 'label': '02dee2e8-4e8a-4811-b95c-53983cd91787', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0xffff7808ed00>]}
21:54:13.118092 [info ] [Thread-1  ]: 1 of 1 OK created sql incremental model HOUSEHOLD_DEV.dx_store_plck ............ [SUCCESS 20000 in 447.40s]
21:54:13.127935 [debug] [Thread-1  ]: Finished running node model.sdna_us_project.dx_store_plck
21:54:13.145933 [debug] [MainThread]: Acquiring new snowflake connection "master"
21:54:13.149558 [info ] [MainThread]: 
21:54:13.151689 [info ] [MainThread]: Finished running 1 incremental model in 0 hours 7 minutes and 30.39 seconds (450.39s).
21:54:13.153361 [debug] [MainThread]: Connection 'master' was properly closed.
21:54:13.343818 [debug] [MainThread]: Flushing usage events

Hi,

Multiple macros can be called by passing a list to pre_hook. Please refer to the post.

Following that approach, won't this work?

pre_hook=[
    """
    CREATE OR REPLACE TEMP TABLE DXH_STORE_PLCK_IN
    AS
    ...
    """,
    "{{ append_placekeys('DXH_STORE_PLCK_IN', 'DXH_STORE_PLCK_TEMP') }}"
]


Hi @bk123,

Your suggestion was correct: that was the approach that finally worked (despite the several dbt anti-patterns included in the model).
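For anyone who lands here later, the shape of the model that finally worked is roughly this (simplified sketch; the temp-table SQL and the incremental filter are unchanged from the original post and elided here):

    {{
        config(
            materialized='incremental',
            unique_key='DX_ID',
            pre_hook=[
                """
                CREATE OR REPLACE TEMP TABLE DXH_STORE_PLCK_IN
                AS
                ...
                """,
                "{{ append_placekeys(tbl_in='DXH_STORE_PLCK_IN', tbl_out='DXH_STORE_PLCK_TEMP') }}"
            ],
            tags=["dx", "refresh:daily"]
        )
    }}

    SELECT
        t.msa_id AS dx_id,
        t.placekey AS placekey,
        t.error AS error,
        CURRENT_TIMESTAMP() AS last_updt_dt
    FROM
        {{ database }}.{{ schema }}.DXH_STORE_PLCK_TEMP AS t

Because dbt renders and runs hooks in list order during the execution phase, the temp table now exists before the macro calls the stored procedure.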

Thanks for your help.

Best regards,
raul
