Diving Into Data tackles a common and pivotal part of the data ecosystem: don’t trust data until you verify it. There are many ways to ensure data accuracy, and even accurate data isn’t always applied correctly. Remember, correlation doesn’t equal causation. Host TC Riley offers his thoughts on best practices, learning from mistakes, and more on this episode.
Riley said, “Data will tell the whole story most of the time, not all the time. Always consider what the data can’t show you.”
The first tenet of trust but verify concerns the source. There are many reliable sources of data, but it’s still wise to be skeptical. Always verify the accuracy of external data, and apply the same scrutiny to internal data. Riley recommends putting routine quality-control checks in place for data systems; doing this early and often ensures you don’t end up with tainted data.
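A routine quality-control check can be as simple as validating each incoming record against a few rules. Here is a minimal sketch of the idea; the field names, records, and bounds are hypothetical, not from the episode.

```python
# Hypothetical routine QC check: flag missing fields and implausible
# numeric values before the data enters downstream systems.

def quality_check(records, required_fields, numeric_bounds):
    """Return a list of (record index, problem description) pairs."""
    issues = []
    for i, row in enumerate(records):
        for field in required_fields:
            if row.get(field) in (None, ""):
                issues.append((i, f"missing {field}"))
        for field, (lo, hi) in numeric_bounds.items():
            value = row.get(field)
            if value is not None and not (lo <= value <= hi):
                issues.append((i, f"{field} out of range: {value}"))
    return issues

records = [
    {"id": 1, "temperature": 21.5},
    {"id": 2, "temperature": 880.0},  # implausible sensor reading
    {"id": 3},                        # missing measurement entirely
]
problems = quality_check(records, ["id", "temperature"],
                         {"temperature": (-50, 60)})
```

Running a check like this on every load, rather than only when something looks wrong, is what “early and often” means in practice.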
The next principle is to consider other factors. Data doesn’t lie, but it doesn’t always paint the complete picture. Riley shared data errors he made on internal projects by not accounting for external forces; in one case, an analysis was off because the comparisons were not apples to apples.
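To illustrate the apples-to-apples trap with an invented example (the numbers and metric are not from the episode): comparing raw monthly totals misleads when the months differ in length, so normalizing to a per-day rate can reverse the conclusion.

```python
# Invented figures: February vs. March order volume.
feb_orders, feb_days = 2800, 28
mar_orders, mar_days = 2945, 31

raw_growth = mar_orders - feb_orders  # +145 orders: looks like growth

# Normalize to orders per day for a like-for-like comparison.
feb_rate = feb_orders / feb_days      # 100.0 orders/day
mar_rate = mar_orders / mar_days      # 95.0 orders/day: actually a decline
```

The raw totals and the normalized rates point in opposite directions, which is exactly the kind of external factor (here, calendar length) that an analysis can miss.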
The third and last area is not assuming that simulation data holds in the real world. Decision-makers, analysts, and companies make this mistake often. Riley pointed to the recent Texas power grid failures as a good example. “Many were convinced that considering the diversity of energy sources and large geographic spread along with backups, nothing could go wrong, but it did,” Riley added.
The significant outlier of a cataclysmic event wasn’t in the data mix. Organizations shouldn’t let extreme outliers paralyze them, but they do need to treat such events as possibilities so they can mitigate the risk.
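One way to stay aware of extreme outliers is to flag them explicitly instead of silently dropping or averaging them away. This is a minimal sketch using an interquartile-range rule; the data and the 3× IQR threshold are assumptions for illustration, not from the episode.

```python
import statistics

def extreme_outliers(values, k=3.0):
    """Return values lying more than k interquartile ranges beyond the quartiles."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

# Hypothetical daily load readings with one cataclysmic spike.
daily_load = [98, 102, 99, 101, 100, 97, 103, 250]
spikes = extreme_outliers(daily_load)
```

Surfacing a value like the spike above prompts the question “what scenario produces this?”, which is the awareness Riley argues for, without letting the outlier dominate routine analysis.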
Listen to previous episodes of Diving Into Data here.