I'm using dbt to materialize a table on PostgreSQL. This table has a temporal field (`timestamp`).
The problem is that when the table is created for the first time, dbt builds all of it with a single SQL query (as defined in my model). That query exceeds `statement_timeout` and is cancelled, so no data is inserted into the table even though a lot of processing has already been done.
Is there a solution that breaks the data up into several steps and commits each step as it completes? The `dbt run` would still take a long time, but each step would be a separate query and wouldn't exceed `statement_timeout`.
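
Conceptually, I'm after something like this, where each batch is its own statement and its own commit (pure illustration; the table, column, and date ranges are made up):

```sql
-- In autocommit mode each statement below is a separate transaction,
-- so each one gets its own statement_timeout budget and commits on its own.
insert into my_table
select * from big_source
where created_at >= '2023-01-01' and created_at < '2023-02-01';

insert into my_table
select * from big_source
where created_at >= '2023-02-01' and created_at < '2023-03-01';

-- ...and so on, one window at a time
```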
I know about dbt's `incremental` materialization and I use it for subsequent runs, but my problem is the initial run.
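
For context, my model is incremental along the lines of this sketch (the source, table, and column names are simplified placeholders):

```sql
{{ config(materialized='incremental') }}

select *
from {{ source('app', 'events') }}
{% if is_incremental() %}
  -- later runs only pick up new rows, so they finish well within the timeout
  where created_at > (select max(created_at) from {{ this }})
{% endif %}
```

On the very first run the `is_incremental()` branch is skipped, so the entire history is selected in one `create table as` statement, and that is the statement that times out.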
Thanks!