Snowpipe with dbt

Hi, in my company we already have a pipeline built in Snowflake in which external stages and Snowpipe load daily objects (CSV files) from an S3 bucket into a table. After that, a process based on streams, tasks, stored procedures, etc. moves the data into a final table. Now we have been asked to migrate all of that logic to dbt.

I'm very new to the tool, so as you might expect I'm having a lot of trouble figuring out how to translate these Snowflake objects into dbt. The first problem that is coming up: through CAWA (our external scheduler) we are going to trigger the COPY INTO that currently lives inside the Snowpipe, so that the data is loaded into a specific table.
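
For context, the existing Snowpipe is roughly of this shape (a simplified sketch; the pipe name, table name and stage path are only illustrative):

-- simplified sketch of the current Snowpipe (names are illustrative)
CREATE OR REPLACE PIPE DB.SCHEMA.LOAD_TABLE_PIPE
  AUTO_INGEST = TRUE
AS
  COPY INTO DB.SCHEMA.NAME_OF_THE_TABLE
  FROM '@STAGE/name_of_the_table'
  FILE_FORMAT = (TYPE = 'csv' FIELD_DELIMITER = ';' SKIP_HEADER = 1);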

For now I have only tried a model that contains just this (the CAWA configuration is handled separately and not shown):

load_table.sql:

{{ config(
    pre_hook = "{{ name_of_a_macro_in_another_folder('name_of_the_table') }}"
) }}
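
For reference, since a dbt model file has to end in a select statement, a minimal sketch of what I understand the complete file should look like is something like this (the select body and all names are just placeholders):

-- models/load_table.sql (sketch; names are illustrative)
{{ config(
    materialized = 'view',
    pre_hook = "{{ name_of_a_macro_in_another_folder('name_of_the_table') }}"
) }}

-- placeholder body so dbt has something to materialize
select * from DB.SCHEMA.name_of_the_table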

And the macro looks like this:

{% macro name_of_a_macro_in_another_folder(table) %}
    {{ log("load starts", True) }}
    {% set dml_transaction -%}
        begin;
        COPY INTO DB.SCHEMA.{{ table }}
        FROM '@STAGE/{{ table }}'  -- I'm not creating this stage in dbt because, as mentioned, it already exists in Snowflake
        FILE_FORMAT = (TYPE = 'csv' FIELD_DELIMITER = ';' SKIP_HEADER = 1 EMPTY_FIELD_AS_NULL = TRUE NULL_IF = ('') RECORD_DELIMITER = '\n' FIELD_OPTIONALLY_ENCLOSED_BY = '"' ESCAPE_UNENCLOSED_FIELD = '\\')
        ON_ERROR = CONTINUE
        FORCE = TRUE;
        commit;
    {%- endset %}
    {# the set block only builds the SQL string, so it still has to be executed #}
    {% do run_query(dml_transaction) %}
    {{ log("load ends", True) }}
{% endmacro %}
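
In case it helps narrow this down, my understanding is that the macro can also be invoked on its own (outside any model) for testing with dbt run-operation, passing the table name as the argument:

dbt run-operation name_of_a_macro_in_another_folder --args '{table: name_of_the_table}'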

The problem appears when I run: dbt run --select load_table

The error indicates that the execution of the load_table model is not calling the macro; instead, it calls another model that has nothing to do with it.

Thoughts?

Thanks