For incremental models, dbt generates the following statements for every "character varying" column:
alter table <table> add column <col>__dbt_alter character varying(17);
update <table> set <col>__dbt_alter = <col>;
alter table <table> drop column <col> cascade;
I am using a Postgres database with roughly 15 million rows in the table. Most of the time the run times out, since running alter table on that much data is slow, and the table is also hard to access while these statements are running. The behavior is not consistent; it does not happen on every load.
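For reference, here is a stripped-down sketch of the kind of incremental model involved (the model, source, and column names below are placeholders, not my actual project):

{{ config(materialized='incremental', unique_key='id') }}

select
    id,
    status,        -- one of several character varying columns that dbt later alters
    comments,      -- another character varying column
    updated_at
from {{ source('app', 'events') }}  -- placeholder source

{% if is_incremental() %}
  -- only pull rows newer than what is already in the target table
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}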
Any suggestion would be greatly appreciated.