How to design 2 separate dbt jobs that write to the same table? dbt won't allow duplicate model file names

We’re migrating an old Pentaho ETL job to dbt and have run into a problem: 2 separate jobs write to the same destination table, 1 general job that inserts/updates data, and 1 job that updates only some specific columns.

In this phase of the migration, we want to keep the logic 1-to-1, so we want to create 2 separate dbt jobs that write to the same table instead of trying to merge the two jobs’ logic.

The problem is that dbt uses the model file name as the destination table name, and it won’t allow duplicate file names, even in different folders. I tried the alias config, but it doesn’t work with dbt run either; dbt still uses the file name when generating the create/merge query.

For example, we need 2 jobs that write data to tableA. I tried something like this:

tableA.sql:

```sql
{{ config(...) }}

-- sql logic here
```

and tableA_1.sql:

```sql
{{ config(alias = 'tableA') }}

-- sql logic here
```

When I run

```
dbt run --select tableA_1.sql
```

Instead of writing to tableA (the alias), dbt still tries to create and write to tableA_1. It seems like the alias is only applied when the model is referenced via `{{ ref(...) }}`.
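For illustration, this is the behaviour we do see: `ref()` resolves the alias at compile time. A minimal sketch, where downstream.sql is a hypothetical model name:

```sql
-- models/downstream.sql (hypothetical downstream model)
-- {{ ref('tableA_1') }} resolves using the model's alias, so this
-- compiles to something like: select * from analytics.tableA
select * from {{ ref('tableA_1') }}
```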

How can I implement multiple dbt jobs with 1 destination table in dbt? Is there any trick to bypass the unique model file name restriction?

Many ETL processes run into this situation. Sometimes you can’t just shuffle everything into 1 job.

dbt can’t have 2 models resolve to the same db object.

Note: @Jeremy Yeo originally posted this reply in Slack. It might not have transferred perfectly.

Is there any trick to implement this?

Not really. Separate dbt projects or maybe write your own custom materialization…

Note: @Jeremy Yeo originally posted this reply in Slack. It might not have transferred perfectly.
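To flesh out that second suggestion: below is a rough sketch of a custom materialization, assuming a warehouse that accepts a plain `insert into ... select`. The materialization name `write_to_existing` and the `target_table` config key are invented for this example; a real version would also need to handle table creation, merges, and column subsets.

```sql
-- macros/write_to_existing.sql (hypothetical)
{% materialization write_to_existing, default %}

  {#- Resolve the destination from a custom config key, so several
      models can point at the same physical table without colliding
      on dbt's unique database-representation check -#}
  {%- set target_relation = api.Relation.create(
        database=this.database,
        schema=this.schema,
        identifier=config.get('target_table', default=this.identifier),
        type='table') -%}

  {{ run_hooks(pre_hooks) }}

  {#- Naive insert-only load of the model's compiled SQL -#}
  {% call statement('main') -%}
    insert into {{ target_relation }}
    {{ sql }}
  {%- endcall %}

  {{ run_hooks(post_hooks) }}

  {{ return({'relations': [target_relation]}) }}

{% endmaterialization %}
```

A model would then use it like this:

```sql
-- models/job_specific/tableA_update.sql (hypothetical)
{{ config(materialized='write_to_existing', target_table='tableA') }}

-- rows this job should insert into tableA
select ...
```

Because the destination comes from a custom config key rather than the model name or alias, the model file can be named anything, sidestepping the duplicate-name restriction.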

Wait are you a freaking bot? Bip bop bip bop I need human interaction