How to use dbt for schema changes to a large database

I’m new to dbt and trying to understand how I would use it in the following scenario. Setting aside the details of development/QA/UAT environments and the CI/CD process: we have an existing production PostgreSQL database with multiple fact tables, each of which receives about 10 million new records per day.

If I wanted to merge two columns in one of those tables, for example, how does dbt handle deploying the new, tested schema to production without some kind of migration script? How would I encode a migration script in dbt? Does dbt provide any tools to make such a migration run efficiently over multiple years’ worth of data (roughly 3.65B records per year)? Any documentation on this topic would be very helpful.
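
To make the question concrete, here is a rough sketch of the kind of dbt model I have in mind — all table and column names are invented for illustration:

```sql
-- models/fct_orders.sql  (hypothetical example; names are made up)
{{
    config(
        materialized='incremental',
        unique_key='order_id'
    )
}}

select
    order_id,
    -- the schema change I mean: merge two existing columns into one
    first_name || ' ' || last_name as customer_name,
    order_ts
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- on incremental runs, only pick up rows newer than the last load
  where order_ts > (select max(order_ts) from {{ this }})
{% endif %}
```

From the docs it looks like changing the columns of an incremental model means either a full rebuild (`dbt run --full-refresh`) or the `on_schema_change` config, but neither obviously covers backfilling the merged column across billions of existing rows efficiently — is there a standard pattern for this?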