Testing with DBT

I was looking at dbt to automate the testing of Snowflake Data Vault tables. These would be full-fledged data validations that involve complex data transformation rules. The tests would then be repeated for data from multiple source systems across different domains, followed by system integration tests.
I am not clear whether dbt supports this kind of data validation for a large number of objects within the data lake.
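For context, the kind of per-table validation I mean is what dbt calls a generic test declared in a schema.yml file; a minimal sketch, with a hypothetical Data Vault hub table and hash key column:

```yaml
# models/schema.yml -- model and column names are hypothetical
version: 2

models:
  - name: hub_customer
    columns:
      - name: customer_hk
        tests:
          - unique      # built-in dbt test: no duplicate hash keys
          - not_null    # built-in dbt test: hash key always populated
```

My question is whether this style of declarative test scales to hundreds of Data Vault objects, or whether it is only practical for a handful of key tables.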
Also, does this tool perform row-wise data comparison between two huge datasets with transformation rules applied? What are the minimum system requirements to install it? Are there any published performance metrics from large-scale comparisons?
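What I have in mind for the row-wise comparison is something like a dbt "singular" test, a SQL file in the tests/ directory where any rows returned are reported as failures; a rough sketch, with hypothetical staging and target table names:

```sql
-- tests/compare_stg_to_hub.sql -- table and column names are hypothetical
-- Any rows returned by this query are counted as test failures.
-- EXCEPT flags rows present in the transformed source but missing
-- from the target Data Vault table.
select src.customer_id, src.customer_name
from {{ ref('stg_customer') }} as src
except
select tgt.customer_id, tgt.customer_name
from {{ ref('hub_customer') }} as tgt
```

Since dbt pushes this SQL down to Snowflake, I assume the comparison runs in the warehouse rather than on the machine running dbt, but I would like confirmation of how this behaves at very large volumes.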
I didn't see dbt in the lists of top 10 or so data test automation solutions. Is it recommended for automated data testing across the source, staging, and Snowflake (Data Vault) layers when the objects and volumes involved are very large?