The problem I’m having
I cannot unit test an expected output with a CSV fixture file when my output contains an array of strings.
If I run the unit test without any value in that column, I get the following message:
actual differs from expected:
@@ ,customer_id,customer_name,customer_featuresUsed
→ ,2 ,My org 1 ,"[]→[""claims"", ""feedback_v1"", ""feedback_v2""]"
+++,3 ,My org 2 ,[]
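For context, my unit test configuration looks roughly like this (model and fixture names are simplified for this post):

```yaml
unit_tests:
  - name: test_customer_features_used
    model: customers            # placeholder model name
    given:
      - input: ref('stg_customers')   # placeholder input
        format: csv
        fixture: stg_customers_fixture
    expect:
      format: csv
      fixture: customers_expected_fixture
```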
If I try to change the CSV output to have "[""claims"", ""feedback_v1"", ""feedback_v2""]", I get:
Database Error
Invalid cast from STRING to ARRAY<STRING> at [68:120]
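The expected-output fixture for that attempt looks something like this, with the quotes doubled per CSV escaping rules:

```csv
customer_id,customer_name,customer_featuresUsed
2,My org 1,"[""claims"", ""feedback_v1"", ""feedback_v2""]"
```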
If I try to change the CSV output to have ["claims", "feedback_v1", "feedback_v2"], my CSV file becomes invalid and I get the following error:
Invalid column name: 'none' in unit test fixture for expected output.
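That variant of the fixture looks roughly like this; I assume the unquoted commas inside the brackets are being read as extra column separators, which is what breaks the CSV parsing:

```csv
customer_id,customer_name,customer_featuresUsed
2,My org 1,["claims", "feedback_v1", "feedback_v2"]
```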
I am looking to unit test my dbt stack using the unit testing feature launched in dbt 1.8.
Out of software-engineering habit I organize my code with dedicated folders for testing, so I keep all my input and expected-output fixtures as CSV files, which I find easier to read and debug than the SQL format.
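My layout is roughly the following (paths simplified), with the CSV fixtures under the default tests/fixtures directory:

```
models/
  customers.sql
  customers.yml                      # contains the unit_tests block
tests/
  fixtures/
    stg_customers_fixture.csv
    customers_expected_fixture.csv
```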
Is there a way to represent an ARRAY value in a CSV fixture file?
(I am using BigQuery as a data warehouse)