Migrating an ETL to dbt: Target table conflict

The problem I’m having

I have two models that insert into the same table in my warehouse (schema='my_schema', alias='my_table', materialized='incremental', incremental_strategy='append').
dbt raises this error:

  dbt found two resources with the database representation "my_schema.my_table".
  dbt cannot create two resources with identical database representations. To fix this,
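For illustration, the conflicting setup looks roughly like this (file and ref names are hypothetical): two model files whose configs resolve to the same my_schema.my_table relation.

```sql
-- models/insert_model_a.sql (hypothetical name)
{{ config(schema='my_schema', alias='my_table',
          materialized='incremental', incremental_strategy='append') }}
select * from {{ ref('upstream_a') }}

-- models/insert_model_b.sql: same schema + alias means the same database
-- representation as insert_model_a, which triggers the error above
{{ config(schema='my_schema', alias='my_table',
          materialized='incremental', incremental_strategy='append') }}
select * from {{ ref('upstream_b') }}
```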

The context of why I’m trying to do this

I want to migrate my ETL jobs to ELT with dbt. The ETL tool lets me have different components inside the same job that load the same table, but dbt seems to have stricter requirements when it comes to targets.

Are there ways to get around this issue? Or do I need to use an external tool to handle the different inserts into the same target?
I think model versions might be related, but they don't seem suited to cases where the dataflows of the jobs are different.

Can you create an intermediate model that joins your two resources, then use that model as the ref for your incremental?
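If so, a minimal sketch (model names are hypothetical): each piece of insert logic becomes a plain model, and one intermediate model unions them so the incremental model can ref a single source.

```sql
-- models/int_my_table_union.sql (hypothetical name)
-- Combines the two former "insert" dataflows into one relation
select * from {{ ref('insert_logic_a') }}
union all
select * from {{ ref('insert_logic_b') }}
```

The incremental model then selects from `{{ ref('int_my_table_union') }}` as its only upstream.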

The two resources don’t necessarily run at the same time.

But if your models are idempotent, it shouldn’t matter if you run them twice; so you can run them once when resource1 is updated, and a second time when resource2 is updated.

Why does dbt raise an error if it’s not an issue?

What is not an issue, or shouldn’t be an issue, is running a model more than once, even if the data hasn’t changed. This follows the logic I mentioned above about having an intermediate model joining your two sources.

What is an issue, and what you won’t be able to do, is have two models with the same name or the same alias.

And that’s the point of my question.
To insert into a table in different ways, I’m using “insert models”. Every insert model uses the same alias. Each insert model selects data from some other models (which models it selects from differs per insert model). The names of the insert models are different.

And I’m wondering if there are ways to get around the issue dbt raises when I do that.

I had the same problem when migrating a job from an old ETL tool to dbt. In the old ETL tool, there were many cases that required separate jobs to insert/update the same table. In dbt that isn’t allowed: only one model per table. Based on my experience, there are a few ways to work around it.

  • Try to “merge” all the logic into one job only, which is quite hard.
  • Use pre_hook/post_hook queries. You can run multiple insert/update statements for the same model via pre_hook/post_hook, but it’s quite limited when the logic is complex.
  • Use a staging layer instead. Write the different pieces of logic as different staging/intermediate models, then union/join all of them in the final model that writes to the final table. I’m using this method.
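The pre_hook/post_hook option above can be sketched like this (model and column names are hypothetical): the model itself appends rows from one source, and a post_hook performs the extra insert from the second source into the same table.

```sql
-- models/my_table.sql (hypothetical): main append logic, plus a second
-- insert into the same table executed as a post_hook
{{ config(
    schema='my_schema',
    alias='my_table',
    materialized='incremental',
    incremental_strategy='append',
    post_hook="insert into {{ this }} (id, amount, loaded_at)
               select id, amount, loaded_at
               from {{ ref('second_source_model') }}"
) }}

select id, amount, loaded_at
from {{ ref('first_source_model') }}
```

Note that this naive post_hook re-inserts all rows from the second source on every run, so you’d need your own filtering inside the hook to keep it idempotent.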

This feels to me like a case where you would want to build a customised materialization.

My first thought is that you could have each model write into separate tables, then schedule another model that unions these two tables. This is similar to quangbilly79’s staging layer solution.
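That staging-layer/union pattern can be sketched as follows (model and column names are hypothetical), with an is_incremental() filter so the final model stays idempotent:

```sql
-- models/staging/stg_my_table_from_a.sql and stg_my_table_from_b.sql
-- each hold one former dataflow's logic (not shown here)

-- models/my_table.sql: unions the staging models into the final table
{{ config(
    schema='my_schema',
    alias='my_table',
    materialized='incremental',
    incremental_strategy='append'
) }}

with unioned as (
    select order_id, amount, loaded_at from {{ ref('stg_my_table_from_a') }}
    union all
    select order_id, amount, loaded_at from {{ ref('stg_my_table_from_b') }}
)

select *
from unioned
{% if is_incremental() %}
-- only append rows newer than what the target table already holds
where loaded_at > (select max(loaded_at) from {{ this }})
{% endif %}
```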