On Data Quality

At Interos, my team built a platform for assembling and maintaining the world’s business supply chains. That required data. A lot of data.

Our data came from everywhere, and just ensuring we received what we were paying for was a challenge. Through a combination of buying & building, we rapidly matured our data governance and developed solid visibility into data deliveries, freshness, and volume. But we couldn’t buy a service that would monitor the data values themselves in a scalable fashion.

No coincidence… I’ve been developing just such a service with the team at Qualytics. The Qualytics platform observes the shape & characteristics of your data down to the field level and deploys deductive learning to infer data quality checks from your actual data values. Those checks are then asserted against newly arriving data to identify anomalies. If that’s not compelling enough, the platform will automatically update those checks as your data evolves.
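To make the pattern concrete, here’s a minimal Python sketch of the general idea, not the Qualytics implementation: infer a simple range check from historical values, assert it against newly arriving records, and re-infer it as the data grows. The field name `shipment_weight_kg` and the helper `infer_range_check` are purely illustrative.

```python
from dataclasses import dataclass


@dataclass
class RangeCheck:
    """A data quality check inferred from historical values of one field."""
    field: str
    low: float
    high: float

    def anomalies(self, records):
        """Return records whose field value falls outside the learned range."""
        return [r for r in records if not (self.low <= r[self.field] <= self.high)]


def infer_range_check(field, records):
    """Learn an inclusive min/max range for a numeric field from historical data."""
    values = [r[field] for r in records if r.get(field) is not None]
    return RangeCheck(field, min(values), max(values))


# Historical deliveries: the actual data values the check is inferred from.
history = [{"shipment_weight_kg": w} for w in (12.0, 48.5, 30.2, 22.7)]
check = infer_range_check("shipment_weight_kg", history)

# Newly arriving data: assert the inferred check and surface anomalies.
incoming = [{"shipment_weight_kg": 25.0}, {"shipment_weight_kg": 510.0}]
print(check.anomalies(incoming))  # -> [{'shipment_weight_kg': 510.0}]

# As the data evolves, re-infer the check from the fuller history, mirroring
# the idea of checks that update automatically.
check = infer_range_check("shipment_weight_kg", history + incoming)
```

A real platform does this across every field, with far richer check types and without the re-inference being a manual step, but the loop is the same: learn from what the data actually looks like, then hold new data to that standard.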

Automated unit testing for your data.