Insights for Leaders | 2 to 3 minute read
There is a quiet orthodoxy in most boardrooms and executive teams: if you have enough data, you can make a good decision. It's a comforting idea. It suggests that uncertainty is a resourcing problem: that with more analysis, better dashboards, another round of due diligence, the right answer will eventually reveal itself.
It won't. Not always. And knowing when data stops being the answer is one of the most underrated skills in leadership.
Data is excellent at describing what has happened. It is reasonably good at identifying patterns in stable, well-understood systems. It becomes significantly less reliable when you're making decisions about situations that are genuinely new, where the future won't resemble the past, or where human behaviour (of competitors, customers, regulators, employees) is the critical variable.
The leaders who get into trouble aren't usually the ones who ignored the data. They're the ones who trusted it too completely. They built detailed models on assumptions that felt solid because they were quantified. The model was rigorous. The assumptions were wrong.
Data gives you confidence. That's its value, and its danger.
You're in territory where data alone won't carry you when:
The decision involves a significant discontinuity (a new market, a structural shift, a crisis) where historical patterns have limited relevance. Past performance data is describing a world that no longer exists.
The numbers keep supporting the conclusion that everyone already wants to reach. When data analysis consistently validates the preferred option, it's worth asking whether the analysis was designed to find an answer or to test one.
The debate in the room is about which data to trust, not what the data means. When the room is comparing competing analyses, different methodologies, and conflicting benchmarks, the decision is no longer a data problem. It's a judgement problem dressed in quantitative clothing.
This is where experienced judgement matters: not instinct, but the kind of structured thinking that comes from having navigated similar uncertainty before, and from understanding which questions to ask when the spreadsheet runs out of answers.
It's also where independent perspective becomes genuinely valuable. Not another analysis, but someone with no stake in the outcome who can ask plainly: what are we assuming here that we haven't examined? That question, asked at the right moment, is worth more than another deck of slides.
The honest framing for most high-stakes decisions under uncertainty isn't "what does the data tell us?" It's "given what we know and what we don't, what is the most defensible choice, and what would have to be true for it to be wrong?"
Boards and leadership teams often feel most settled when they have a lot of data. That comfort is understandable. But confidence derived from data volume is not the same as confidence derived from clear thinking about a well-framed problem.
The leaders who navigate uncertainty well have learned to sit with a different kind of confidence, one that doesn't come from certainty about the answer, but from rigour about the process that got them there.
That's a harder thing to build. It's also far more durable.
Next in this series: The Role of Independent Advice