dbt unit tests - Issue/question on data format in fixture CSV files - Array of strings

The problem I’m having

I cannot unit test an expected output with a CSV fixture file when my output contains an array of strings.
If I run the unit test without any value in that column, I get the following message:

actual differs from expected:

@@ ,customer_id,customer_name,customer_featuresUsed
→  ,2          ,My org 1     ,"[]→[""claims"", ""feedback_v1"", ""feedback_v2""]"
+++,3          ,My org 2     ,[]

If I try to change the CSV output to have "[""claims"", ""feedback_v1"", ""feedback_v2""]", I get

   Database Error
    Invalid cast from STRING to ARRAY<STRING> at [68:120]
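For reference, the fixture CSV for that attempt (using standard CSV quote-doubling to escape the inner quotes, per RFC 4180) would look like this; the column names and values mirror the diff above:

```csv
customer_id,customer_name,customer_featuresUsed
2,My org 1,"[""claims"", ""feedback_v1"", ""feedback_v2""]"
3,My org 2,[]
```

The whole cell is parsed as a single STRING, which is presumably why BigQuery then refuses the STRING-to-ARRAY cast.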

If I try to change the CSV output to have ["claims", "feedback_v1", "feedback_v2"], my CSV file becomes invalid and I get the following error:

    Invalid column name: 'none' in unit test fixture for expected output.

I am looking to unit test my dbt stack using the unit test feature launched in dbt 1.8.
Out of software engineering habit, I organize my code using dedicated folders for testing, so I keep all my input and output files in CSV syntax, which I find easier to debug/read than the SQL format.

Is there a way to use the CSV format in a file to represent an ARRAY value?
(I am using BigQuery as a data warehouse)

I did see this topic (https://discourse.getdbt.com/t/workaround-for-arrays-not-supported-in-unit-test-expected-output/13762) mentioning that I could use the SQL format, but:
• The SQL format is a nightmare to write/read when organized in files
• It would be quite strange to have all my test inputs/outputs in CSV and only the ones with ARRAY fields in SQL
So I would like to know: are some of you already using the unit test framework with CSV files in dedicated folders?
If so, have you experienced any issues with data formats for ARRAY fields? :thinking_face:
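For completeness, the SQL-format workaround from that topic would look roughly like this in the unit test YAML (a sketch only; the model name, input ref, and fixture name are illustrative, and the array literals use BigQuery syntax):

```yaml
unit_tests:
  - name: test_customer_features_used
    model: customers  # illustrative model name
    given:
      - input: ref('stg_customers')  # illustrative input
        format: csv
        fixture: stg_customers_fixture  # CSV file under tests/fixtures/
    expect:
      format: sql
      rows: |
        select 2 as customer_id, 'My org 1' as customer_name,
               ['claims', 'feedback_v1', 'feedback_v2'] as customer_featuresUsed
        union all
        select 3, 'My org 2', ARRAY<STRING>[]  -- typed empty array literal
```

This mixes CSV inputs with a SQL expected output, which is exactly the inconsistency described above, but it does let BigQuery build a real ARRAY<STRING> instead of trying to cast a string.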

Note: @Simon Duvergier (Shipup) originally posted this reply in Slack. It might not have transferred perfectly.