Install dbt_utils > dbt_project.yml not found

Hey, I am using dbt-core 1.5.4 via Airflow on Google Cloud Composer. Recently I started using dbt_utils and declared it in packages.yml (which sits at the same level as dbt_project.yml) as follows:

packages:
  - local: ../other_project_1
  - local: ../other_project_2
  - package: dbt-labs/dbt_utils
    version: 1.3.0

About 80% of the time the DAG that executes the dbt build commands succeeds, but the other 20% of the time it fails with:

Failed to read package: Runtime Error
No dbt_project.yml found at expected path /home/airflow/gcs/dags/dbt/ep_app/dbt_packages/dbt_utils/dbt_project.yml
Verify that each entry within packages.yml (and their transitive dependencies) contains a file named dbt_project.yml

Shouldn’t dbt_project.yml always be present in an installed copy of dbt_utils? The configuration of succeeded and failed runs is 100% identical, so how can it fail only some of the time? Does anyone have a clue what’s going on here?
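In case it helps narrow things down: on Composer, /home/airflow/gcs/dags is a GCS FUSE mount, so one hypothesis is that a bucket sync removes or partially rewrites dbt_packages/dbt_utils between the dbt deps and dbt build steps. A minimal sketch of a workaround I am considering, which runs dbt from a worker-local copy of the project so dbt deps only ever writes to local disk (the function name and paths here are illustrative, not my actual operator code):

```shell
run_dbt_from_local_copy() {
  # $1: dbt project dir on the GCS mount; $2: dbt executable (defaults to "dbt").
  # Copy the project off the FUSE mount first, so that `dbt deps` writes
  # dbt_packages/ only to worker-local disk, which a bucket sync cannot
  # touch mid-run.
  local src="$1" dbt_bin="${2:-dbt}" work
  work="$(mktemp -d)"
  cp -r "$src" "$work/project"
  (
    cd "$work/project"
    "$dbt_bin" deps
    "$dbt_bin" build --select tag:ep_app --target-path "$work/target"
  )
}
```

The trade-off is an extra copy per task run, but it would at least tell me whether the flakiness is tied to the GCS mount.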

Detailed log:

airflow-worker-9ffhr
*** Reading remote log from gs://europe-west1-everphone-bi-p-b71fcf36-bucket/logs/dag_id=ep_app_case_items/run_id=scheduled__2024-09-23T06:00:00+00:00/task_id=process_case_items/attempt=2.log.
[2024-09-23, 09:10:41 CEST] {taskinstance.py:1104} INFO - Dependencies all met for dep_context=non-requeueable deps ti=<TaskInstance: ep_app_case_items.process_case_items scheduled__2024-09-23T06:00:00+00:00 [queued]>
[2024-09-23, 09:10:41 CEST] {taskinstance.py:1104} INFO - Dependencies all met for dep_context=requeueable deps ti=<TaskInstance: ep_app_case_items.process_case_items scheduled__2024-09-23T06:00:00+00:00 [queued]>
[2024-09-23, 09:10:41 CEST] {taskinstance.py:1309} INFO - Starting attempt 2 of 2
[2024-09-23, 09:10:41 CEST] {taskinstance.py:1328} INFO - Executing <Task(DBTBashOperator): process_case_items> on 2024-09-23 06:00:00+00:00
[2024-09-23, 09:10:41 CEST] {standard_task_runner.py:57} INFO - Started process 15690 to run task
[2024-09-23, 09:10:41 CEST] {standard_task_runner.py:84} INFO - Running: ['airflow', 'tasks', 'run', 'ep_app_case_items', 'process_case_items', 'scheduled__2024-09-23T06:00:00+00:00', '--job-id', '1597443', '--raw', '--subdir', 'DAGS_FOLDER/ep_app_case_items.py', '--cfg-path', '/tmp/tmp2yn20tmk']
[2024-09-23, 09:10:41 CEST] {standard_task_runner.py:85} INFO - Job 1597443: Subtask process_case_items
[2024-09-23, 09:10:41 CEST] {task_command.py:414} INFO - Running <TaskInstance: ep_app_case_items.process_case_items scheduled__2024-09-23T06:00:00+00:00 [running]> on host airflow-worker-9ffhr
[2024-09-23, 09:10:42 CEST] {taskinstance.py:1547} INFO - Exporting env vars: AIRFLOW_CTX_DAG_EMAIL='analytics@everphone.de' AIRFLOW_CTX_DAG_OWNER='airflow' AIRFLOW_CTX_DAG_ID='ep_app_case_items' AIRFLOW_CTX_TASK_ID='process_case_items' AIRFLOW_CTX_EXECUTION_DATE='2024-09-23T06:00:00+00:00' AIRFLOW_CTX_TRY_NUMBER='2' AIRFLOW_CTX_DAG_RUN_ID='scheduled__2024-09-23T06:00:00+00:00'
[2024-09-23, 09:10:42 CEST] {subprocess.py:63} INFO - Tmp dir root location: /tmp
[2024-09-23, 09:10:42 CEST] {subprocess.py:75} INFO - Running command: ['/usr/bin/bash', '-c', '\n            cd /home/airflow/gcs/dags/dbt/ep_app &&             rm -rf ../../../data/dbt/ep_app/target/* &&             dbt deps &&             dbt build --select tag:ep_app --target-path ../../../data/dbt/ep_app/target\n            ']
[2024-09-23, 09:10:42 CEST] {subprocess.py:86} INFO - Output:
[2024-09-23, 09:10:46 CEST] {subprocess.py:93} INFO - 07:10:46  Running with dbt=1.5.4
[2024-09-23, 09:10:46 CEST] {subprocess.py:93} INFO - 07:10:46  Installing ../other_project_1
[2024-09-23, 09:10:46 CEST] {subprocess.py:93} INFO - 07:10:46  Installed from <local @ ../other_project_1>
[2024-09-23, 09:10:46 CEST] {subprocess.py:93} INFO - 07:10:46  Installing ../other_project_2
[2024-09-23, 09:10:46 CEST] {subprocess.py:93} INFO - 07:10:46  Installed from <local @ ../other_project_2>
[2024-09-23, 09:10:46 CEST] {subprocess.py:93} INFO - 07:10:46  Installing dbt-labs/dbt_utils
[2024-09-23, 09:10:47 CEST] {subprocess.py:93} INFO - 07:10:47  Installed from version 1.3.0
[2024-09-23, 09:10:47 CEST] {subprocess.py:93} INFO - 07:10:47  Up to date!
[2024-09-23, 09:10:51 CEST] {subprocess.py:93} INFO - 07:10:51  Running with dbt=1.5.4
[2024-09-23, 09:10:53 CEST] {subprocess.py:93} INFO - 07:10:53  Registered adapter: bigquery=1.5.4
[2024-09-23, 09:10:53 CEST] {subprocess.py:93} INFO - 07:10:53  Encountered an error:
[2024-09-23, 09:10:53 CEST] {subprocess.py:93} INFO - Runtime Error
[2024-09-23, 09:10:53 CEST] {subprocess.py:93} INFO -   Failed to read package: Runtime Error
[2024-09-23, 09:10:53 CEST] {subprocess.py:93} INFO -     No dbt_project.yml found at expected path /home/airflow/gcs/dags/dbt/ep_app/dbt_packages/dbt_utils/dbt_project.yml
[2024-09-23, 09:10:53 CEST] {subprocess.py:93} INFO -     Verify that each entry within packages.yml (and their transitive dependencies) contains a file named dbt_project.yml
[2024-09-23, 09:10:53 CEST] {subprocess.py:93} INFO - 
[2024-09-23, 09:10:53 CEST] {subprocess.py:93} INFO - 
[2024-09-23, 09:10:53 CEST] {subprocess.py:93} INFO - Error encountered in /home/airflow/gcs/dags/dbt/ep_app/dbt_packages/dbt_utils
[2024-09-23, 09:10:54 CEST] {subprocess.py:97} INFO - Command exited with return code 2
[2024-09-23, 09:10:54 CEST] {taskinstance.py:1826} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/opt/python3.11/lib/python3.11/site-packages/airflow/operators/bash.py", line 210, in execute
    raise AirflowException(
airflow.exceptions.AirflowException: Bash command failed. The command returned a non-zero exit code 2.
[2024-09-23, 09:10:54 CEST] {taskinstance.py:1346} INFO - Marking task as FAILED. dag_id=ep_app_case_items, task_id=process_case_items, execution_date=20240923T060000, start_date=20240923T071041, end_date=20240923T071054
[2024-09-23, 09:10:54 CEST] {warnings.py:109} WARNING - /opt/python3.11/lib/python3.11/site-packages/airflow/utils/context.py:205: AirflowContextDeprecationWarning: Accessing 'execution_date' from the template is deprecated and will be removed in a future version. Please use 'data_interval_start' or 'logical_date' instead.
  warnings.warn(_create_deprecation_warning(key, self._deprecation_replacements[key]))

[2024-09-23, 09:10:54 CEST] {warnings.py:109} WARNING - /opt/python3.11/lib/python3.11/site-packages/airflow/providers/slack/hooks/slack_webhook.py:42: UserWarning: You cannot override the default channel (chosen by the user who installed your app), username, or icon when you're using Incoming Webhooks to post messages. Instead, these values will always inherit from the associated Slack app configuration. See: https://api.slack.com/messaging/webhooks#advanced_message_formatting. It is possible to change this values only in Legacy Slack Integration Incoming Webhook: https://api.slack.com/legacy/custom-integrations/messaging/webhooks#legacy-customizations
  resp = func(*args, **kwargs)

[2024-09-23, 09:10:54 CEST] {base.p