Hi,
We use dbt Cloud for our transformations and Snowflake is our data warehouse. However, we have two Snowflake instances: one for development and one for production.
We use the dev Snowflake instance for DEV and QA.
We have a dbt project in dbt Cloud configured to point at the dev Snowflake instance. All our developers build their models there, and we have two deployment environments configured, DEV and QA, to test our dbt jobs.
We have configured a new project for the PROD environment since it uses a different Snowflake instance. Code promotion is done via GitHub, so the prod environment pulls the latest code from the prod branch. But how can we get our dbt jobs into this new project?
AFAIK there’s no existing feature for exporting and importing jobs (or any other dbt Cloud resources) between projects. You can use the dbt Cloud API to get job information from dbt Cloud and then re-create the jobs in the new project. I’m working on a dbt-cloud-cli (https://github.com/data-mie/dbt-cloud-cli) and I’m planning to add a job export/import feature during the next two weeks. Let me know if that’d be of interest to you.
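In the meantime, something like this should work as a rough sketch against the v2 Administrative API (the account, project, environment, and job IDs are placeholders, and the exact payload fields you need to rewrite or null out may vary, so double-check against the API docs):

```bash
# Assumptions: both projects live in the same dbt Cloud account,
# and jq is installed. IDs (123, 456, 789) are illustrative placeholders.
export DBT_CLOUD_API_TOKEN="<your API token>"
export DBT_CLOUD_ACCOUNT_ID="<your account id>"

# 1. Fetch the job definition from the dev project (job id 123)
curl -s -H "Authorization: Token $DBT_CLOUD_API_TOKEN" \
  "https://cloud.getdbt.com/api/v2/accounts/$DBT_CLOUD_ACCOUNT_ID/jobs/123/" \
  | jq '.data' > job.json

# 2. Re-point the job at the prod project/environment and clear the old id
jq '.id = null | .project_id = 456 | .environment_id = 789' \
  job.json > prod_job.json

# 3. Create the job in the prod project
curl -s -X POST \
  -H "Authorization: Token $DBT_CLOUD_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d @prod_job.json \
  "https://cloud.getdbt.com/api/v2/accounts/$DBT_CLOUD_ACCOUNT_ID/jobs/"
```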
@Simo that’s interesting, I would love to try it out. It will cut down the time spent re-creating jobs and copy-pasting the commands.
@anas That’s great! I’ll let you know when the feature is available.
@anas Release 0.2.0 is out now! See dbt-cloud-cli on PyPI (https://pypi.org/project/dbt-cloud-cli/) and the release notes on GitHub (https://github.com/data-mie/dbt-cloud-cli/releases).
There are now “dbt-cloud job export” and “dbt-cloud job import” commands that you can use to copy jobs between projects. Please let me know if you have any questions or feedback!
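For example, to copy a job from your dev project into the prod one, you can pipe export into import and rewrite the IDs in between (the job, project, and environment IDs below are placeholders, and this assumes you have jq installed; see the README for the full set of options):

```bash
# Credentials are picked up from environment variables
export DBT_CLOUD_API_TOKEN="<your API token>"
export DBT_CLOUD_ACCOUNT_ID="<your account id>"

# Export the dev job, re-point it at the prod project/environment,
# and create it in the new project
dbt-cloud job export --job-id 43167 \
  | jq '.project_id = 456 | .environment_id = 789' \
  | dbt-cloud job import
```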