Trial and error

To err is human. As a species, we’re hardwired to make mistakes. Our cognitive resources are finite; when too much information is thrown at us, or we’re operating in a heightened state of emotion, we tend to switch into efficiency mode.

Our attention span gets stretched thin, impairing our ability to process information, perceive detail and make accurate decisions. It’s why we send emails with typos despite having read them multiple times and accidentally put the milk in the cupboard and the cereal in the fridge. Rather than being caused by careless or malicious behaviour, our errors and blunders are more often the inevitable consequence of normal human limitations.

And yet, few things evoke such anxiety and shame. Since our schooldays, we’ve been taught that mistakes should be crossed out, erased, or punished. Even the most innocent slip-ups are hard to own up to for fear of judgement. And there are legitimate reasons for this. Mistakes can be costly to people, livelihoods and reputations. In some cases, they’re the difference between life and death.

Learning the hard way

We often hear the phrase “human error” used in relation to fatal accidents or system failures. It’s become associated with plane crashes, road collisions and even high-profile disasters like the Chernobyl meltdown.

It’s a simple, two-word expression that captures the catastrophic consequences of our fallibility. It affirms the notion that mistakes are inherently bad and something to be avoided at all costs.

But, as behavioural science tells us, that’s not possible. And that’s OK, because our mistakes actually offer vital insight into how we operate, and whether the systems, structures and tools we live by and interact with have been designed effectively.

Repeated patterns of things “just not working” – ignored communications, underused employee platforms, late timesheets – are not necessarily evidence of widespread incompetence. Yet these instances are often framed as people problems rather than design problems, in which systems fail to match real-world behaviours. And when these so-called people problems are met with blame and shame instead of curiosity, we miss the chance to understand the factors that led to them.

If we want to build a safer, smarter and more innovative world, we shouldn’t treat error as the enemy. Instead, we should view it as essential data.

In high-risk industries like healthcare, formal systems for analysing and preventing errors are essential to protect the safety of patients and staff. In the UK, all healthcare workers have a professional duty of candour. If something goes wrong, they must be open and honest with colleagues, patients and people in their care.

Ethical principles like this help build a sense of psychological safety, empowering people to feel comfortable speaking up, admitting their mistakes and challenging the status quo.

Amy C. Edmondson, a professor at Harvard Business School, discovered during a research project on psychological safety in hospitals that the units reporting more errors also reported better patient outcomes.

There are plenty of other industries where error is also welcomed as inevitable and informative. On every Toyota production line, there’s something called an Andon Cord. If someone finds a fault, defect, malfunction or safety issue, they pull the cord to stop part of the line and alert those around them. Colleagues are encouraged to thank the person who pulled the cord, acknowledging the importance of asking for help and spotting an opportunity for improvement.

Failing forward

Mistakes can have unexpected positive consequences beyond simply teaching us how to avoid repeating them. In the exploratory environments where I play, like marketing, product development, research, strategy and service design, uncertainty is part and parcel of the job. If everything works fine the first time, we’re probably not stretching far enough.

The late political scientist and sociologist James G. March, best known for his research on organisational decision-making, argued that innovation is always the result of trial and error. He maintained that when organisations prioritise exploration (trying new things) over exploitation (doing what already works), failure is to be expected.

Think of the tech-bro motto “Move fast and break things”, coined by Mark Zuckerberg and widely adopted across Silicon Valley. While its meaning has soured over the years (along with the Facebook founder’s popularity), the core sentiment still rings true: error is the price of experimentation.

Mistaken machines

There’s a potential future where we won’t need to worry about making mistakes, because AI will be able to take on all the tedious, labour-intensive, high-risk tasks that often trip us up and lead to error.

After all, AI doesn’t get tired, need breaks or have bad days. It’s not impossible to imagine a world in which we no longer need to proofread our emails or pull the cord on a production line.

But for now, at least, AI is almost as error-prone as people are, albeit in different ways. From hallucinating data to creating nonsensical definitions for made-up idioms, today’s chatbots and agents have their own cognitive limitations to reckon with. Plus, even if humans are removed from certain tasks, there’s still a risk of displacing our mistakes, biases, and inaccuracies into AI training data.

In December, a video of a man accidentally causing a humanoid robot to kick him in the groin went viral on social media. The clip’s an apt metaphor for our times, and a clear example of why it’s important to remember that AI has been created by the same flawed, fallible people some claim it can replace.

When mistakes happen, and they will happen, they can be crucial signals that guide us towards improvement and innovation. When people and organisations treat them as valuable data, they unlock opportunities for learning and discovery. So, next time someone makes a mistake, look beyond the error – it could be that they’ve stumbled onto something important.

Next up in Edition #08

  • Lessons beyond the speed limit

    Effective communication begins by understanding other people’s expectations and assumptions. Doing this requires us to burst out of our bubbles. Alex Wilman reflects on how rare this is, but how powerful it can be.

  • Back to the future?

    Who remembers the metaverse? It was once hyped as the next generation of everything, but at a cost of billions. Josh Handley explores why new technologies don’t always unlock the future.
