Schema mismatch on dbt Python model re-run

Simple dbt Python model with 5 fields (2 INTEGER, 1 TIMESTAMP, 2 STRING), all NULLABLE.
The model runs successfully only the first time: it creates a BigQuery table and populates it with data. Any subsequent run of the model fails with:
"Error in custom provider, java.lang.IllegalArgumentException: com.google.cloud.bigquery.connector.common.BigQueryConnectorException$InvalidSchemaException: Destination table’s schema is not compatible with dataframe’s schema"
This happens even when it runs over the same data. I compared the schemas of the source and destination tables, and they seem the same. Spark printSchema():
|-- customer_id: integer (nullable = true)
|-- created: timestamp (nullable = true)
|-- attempt_number: integer (nullable = false)
|-- status_desc: string (nullable = true)
|-- payment_type: string (nullable = true)

BigQuery schema:

  • name=customer_id, type=INTEGER, max_length=None, mode=NULLABLE, precision=None, scale=None
  • name=created, type=TIMESTAMP, max_length=None, mode=NULLABLE, precision=None, scale=None
  • name=attempt_number, type=INTEGER, max_length=None, mode=NULLABLE, precision=None, scale=None
  • name=status_desc, type=STRING, max_length=None, mode=NULLABLE, precision=None, scale=None
  • name=payment_type, type=STRING, max_length=None, mode=NULLABLE, precision=None, scale=None
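For what it's worth, here is a pure-Python sketch that diffs the two dumps above field by field, including nullability (the values are copied straight from the listings; type names are normalized to BigQuery spellings). It shows the one attribute where the dumps disagree: attempt_number is nullable = false in the Spark schema but NULLABLE in BigQuery. Whether the 0.34 connector treats that as an incompatibility is exactly what I am unsure about.

```python
# Field-by-field diff of the two schema dumps: (name, type, nullable) tuples.
spark_schema = [
    ("customer_id", "INTEGER", True),
    ("created", "TIMESTAMP", True),
    ("attempt_number", "INTEGER", False),  # printSchema() reports nullable = false
    ("status_desc", "STRING", True),
    ("payment_type", "STRING", True),
]
bq_schema = [
    ("customer_id", "INTEGER", True),
    ("created", "TIMESTAMP", True),
    ("attempt_number", "INTEGER", True),  # BigQuery mode is NULLABLE
    ("status_desc", "STRING", True),
    ("payment_type", "STRING", True),
]

def schema_diff(a, b):
    """Return pairs of corresponding fields that differ in name, type, or nullability."""
    return [(x, y) for x, y in zip(a, b) if x != y]

print(schema_diff(spark_schema, bq_schema))
# Only attempt_number differs, and only in its nullability flag.
```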

I run it on a Dataproc cluster with BigQuery connector 0.34. When run on Dataproc Serverless (which uses BigQuery connector 0.22 by default), no such issue occurs.

Has anybody run into the same problem? Thank you!