Can't run any SQL files: something called "_deep_map_render" was passed a datetime.date instead of an allowed type

The problem I’m having

I’m not able to run any SQL files, and whenever I try I get this error:

Running with dbt=1.5.0
Encountered an error:
Runtime Error
  in _deep_map_render, expected one of (<class 'list'>, <class 'dict'>, <class 'int'>, <class 'float'>, <class 'str'>, <class 'NoneType'>, <class 'bool'>), got <class 'datetime.date'>

I was using the date_diff function when this error popped up, but even after reverting everything to the master branch, which runs fine on my colleagues' computers, no files will run. They all give the error above.
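For context on the error itself: YAML parsers turn an unquoted ISO date (e.g. a `start_date: 2023-01-01` var in dbt_project.yml) into a Python `datetime.date`, and the error message suggests dbt's internal deep-map walk only accepts primitive types. A rough sketch of that kind of recursive type check — this is NOT dbt's actual `_deep_map_render` code, and `start_date` is a hypothetical var name:

```python
import datetime

# Types the dbt error message says _deep_map_render accepts.
ALLOWED = (list, dict, int, float, str, type(None), bool)

def deep_map_check(value, path="vars"):
    """Recursively verify every nested value is an allowed primitive type.
    A sketch of the kind of check the error message implies; not dbt's code."""
    if isinstance(value, dict):
        for key, item in value.items():
            deep_map_check(item, f"{path}.{key}")
    elif isinstance(value, list):
        for i, item in enumerate(value):
            deep_map_check(item, f"{path}[{i}]")
    elif not isinstance(value, ALLOWED):
        raise TypeError(
            f"in {path}, expected one of {ALLOWED}, got {type(value)}"
        )

# An unquoted YAML date becomes datetime.date, which fails the check,
# producing an error shaped like the one above:
try:
    deep_map_check({"start_date": datetime.date(2023, 1, 1)})
except TypeError as err:
    print(err)
```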

What I’ve already tried

  • Reverting to master
  • Looking at the logs file (didn’t see anything related)
  • Scouring the web to figure out what _deep_map_render does or where it is
  • Asking others on my team

Recent log messages

13:35:43.420571 [info ] [MainThread]: Connection:
13:35:43.420788 [info ] [MainThread]:   method: oauth
13:35:43.420930 [info ] [MainThread]:   database: current-production
13:35:43.421062 [info ] [MainThread]:   schema: dbt_<matt.dorros@current.com>
13:35:43.421192 [info ] [MainThread]:   location: None
13:35:43.421318 [info ] [MainThread]:   priority: None
13:35:43.421446 [info ] [MainThread]:   timeout_seconds: None
13:35:43.421568 [info ] [MainThread]:   maximum_bytes_billed: None
13:35:43.421693 [info ] [MainThread]:   execution_project: current-production
13:35:43.421821 [info ] [MainThread]:   job_retry_deadline_seconds: None
13:35:43.421943 [info ] [MainThread]:   job_retries: 1
13:35:43.422067 [info ] [MainThread]:   job_creation_timeout_seconds: None
13:35:43.422191 [info ] [MainThread]:   job_execution_timeout_seconds: None
13:35:43.422312 [info ] [MainThread]:   gcs_bucket: None
13:35:43.425921 [debug] [MainThread]: Acquiring new bigquery connection 'debug'
13:35:43.426395 [debug] [MainThread]: Opening a new connection, currently in state init
13:35:43.771630 [debug] [MainThread]: On debug: select 1 as id
13:35:44.744217 [debug] [MainThread]: BigQuery adapter: https://console.cloud.google.com/bigquery?project=current-production&j=bq:US:8dc7e629-6c26-45f3-b658-6a85a8edaf8c&page=queryresults
13:35:44.745925 [info ] [MainThread]:   Connection test: [OK connection ok]

13:35:44.746531 [info ] [MainThread]: All checks passed!
13:35:44.747016 [info ] [MainThread]: Could not load dbt_project.yml

13:35:44.748571 [debug] [MainThread]: Command `dbt debug` succeeded at 13:35:44.748159 after 2.18 seconds
13:35:44.749437 [debug] [MainThread]: Connection 'debug' was properly closed.
13:35:44.750127 [debug] [MainThread]: Sending event: {'category': 'dbt', 'action': 'invocation', 'label': 'end', 'context': [<snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x103699cf0>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x11d432c80>, <snowplow_tracker.self_describing_json.SelfDescribingJson object at 0x11d47d1b0>]}
13:35:44.750838 [debug] [MainThread]: Flushing usage events

Turns out I needed to update my dbt version. All set now; I'm just not sure how to close this ticket out.
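For anyone else who hits this and can't upgrade right away: since YAML parses an unquoted ISO date into a `datetime.date`, quoting the date so it stays a string may also avoid the error. A hypothetical dbt_project.yml fragment (`start_date` is a made-up var name, not from this project):

```yaml
vars:
  start_date: "2023-01-01"   # quoted: parsed as str, not datetime.date
```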