The first option is definitely how most users orchestrate dbt Cloud jobs from Airflow. For more flexibility, you can use the `steps_override` parameter to swap out the dbt commands at trigger time (useful if you're building them dynamically). Keep in mind how our scheduler queues jobs, though: you can't run the same job ID in dbt Cloud in parallel, but you can run as many unique jobs in parallel as you wish.
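For example, here's a minimal sketch of triggering a dbt Cloud job from Airflow with `steps_override`, using the `DbtCloudRunJobOperator` from the `apache-airflow-providers-dbt-cloud` package. The job ID, connection ID, and model selector below are placeholders, not values from this thread:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.dbt.cloud.operators.dbt import DbtCloudRunJobOperator

with DAG(
    dag_id="dbt_cloud_steps_override_example",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Trigger an existing dbt Cloud job, replacing its configured steps
    # with commands built at trigger time.
    trigger_job = DbtCloudRunJobOperator(
        task_id="trigger_dbt_cloud_job",
        dbt_cloud_conn_id="dbt_cloud_default",  # Airflow connection to your dbt Cloud account
        job_id=12345,  # hypothetical dbt Cloud job ID
        steps_override=["dbt build --select my_model+"],  # replaces the job's steps for this run only
        check_interval=30,  # seconds between run-status polls
        timeout=3600,  # fail the task if the run hasn't finished within an hour
    )
```

Because dbt Cloud queues concurrent runs of the same job ID, if you need true parallelism you'd point separate tasks at distinct jobs rather than overriding the steps of one job repeatedly.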
As for option 2, you'd be executing dbt Core directly, so you lose many of the advantages of running a job in dbt Cloud: job artifact management (which drives state comparison for CI/merge jobs), metadata population for the Explore page, cross-project references (if you're using them), and more! I'm not saying you can't make option 2 work, but in the long term, going with option 1 will be a better experience.