The Logic of Failure

Failure is something we all have to endure from time to time. But how do we react to failure? And what are the implications? In his book “Black Box Thinking,” Matthew Syed nudges us not to look at failure as shameful and stigmatizing, but as exciting and enlightening.

This is not, however, a self-help book. Matthew examines two of the most safety-critical industries in the world today: healthcare and aviation. While these industries differ in psychology, culture and institutional change, the most profound difference lies in their divergent approaches to failure.

In the airline industry, for example, black boxes record flight data, cockpit conversations, and more. If there is an accident, the boxes are opened, the data is analyzed, and the reason for the accident is excavated. This ensures that procedures can be changed so that the same error never happens again. Through this method aviation has attained an impressive safety record.

Do watch the Air Crash Investigation series on Disney/Hotstar: https://www.hotstar.com/my/tv/air-crash-investigation/8229

In healthcare, however, things are different. Millions are injured or die each year as a result of preventable medical errors; counting such deaths in North America alone, it is like two 747s crashing every day. This happens even when medical professionals are going about their business with the diligence and concern you would expect.

We would not tolerate that degree of preventable harm in any other forum. Why, then, do so many mistakes happen? Complexity? Scarce resources? Research into these preventable deaths showed that they have “signatures,” recurring patterns. With open reporting and honest evaluation, these errors could be spotted and reforms put in place to stop them from happening again, as happens in aviation. But, all too often, they aren’t.

For reasons both prosaic and profound, a failure to learn from mistakes has been one of the greatest obstacles to human progress. Confronting this could transform not only healthcare, but business, politics and much else besides. A progressive attitude to failure turns out to be a cornerstone of success for any institution.

The author goes on to examine how we respond to failure, as individuals, as businesses, as societies. How do we deal with it, and learn from it?


Anatomy of Failure-Denial

When failure is related to something important in our lives — our job, our role as a parent, our wider status — it is taken to a different level altogether. When our professionalism is threatened, we are liable to put up defenses.

For senior doctors, who have spent years in training and have reached the top of their profession, being open about mistakes can be almost traumatic.

Society, as a whole, has a deeply contradictory attitude to failure. We also have a deep instinct to find scapegoats. We all have a sophisticated ability to delete failure from memory.

The net effect is simple: it obliterates openness and spawns cover-ups. It destroys the vital information we need in order to learn. By concealing mistakes, doctors are unable to learn from them. Look at how they talk about errors: a mistake morphs into a “complication,” an “unanticipated outcome,” a “technical error.” Doctors say “we did the best we could” (the most common response to failure in the world today). Above all, they learn not to tell the patient anything.

The problem is not just about the consequences of failure. It is also about the attitude towards failure. Only by redefining failure will we unleash progress, creativity and resilience.

In aviation, on the other hand, learning from failure is hardwired into the system. Pilots are generally open and honest about their own mistakes (crash landings, near misses). When they make a mistake, provided they file a report within ten days, they enjoy immunity. They are also not intimidated by admitting errors, because they recognize their value. Many planes are fitted with data systems that automatically send reports when parameters have been exceeded. There are powerful, independent bodies designed to investigate crashes. Failure is not regarded as an indictment of the specific pilot who messes up, but as a precious learning opportunity for all pilots, all airlines and all regulators.

Instead of concealing failure, or skirting around it, aviation has a system where failure is data rich. Learn from the mistakes of others. You can’t live long enough to make them all yourself!

Just some of the learnings from investigating airline accidents:

  1. Social hierarchies inhibit assertiveness. We talk to those in authority (a First Officer to a Pilot in Command, for example) in what is called ‘mitigated language.’ Compare “It’s imperative we have a meeting on Monday” with “Don’t worry if you’re busy, but it might be helpful if you could spare half an hour on Monday.” This deference makes sense in many situations, but it can be fatal when a 90-ton aeroplane is running out of fuel. In airlines, First Officers were taught assertiveness procedures. The same hierarchy also exists in operating theatres! A system insensitive to the limitations of human psychology is a big problem!
  2. In emergencies, crew lose their perception of time. If you focus on one thing, you will lose awareness of other things.
  3. When people don’t interrogate errors, they sometimes don’t even know they have made one!

Black-box Thinking?

For organizations beyond aviation, it is not about creating a literal black box. Rather, it is about the willingness and tenacity to investigate the lessons that often exist when we fail, but which we rarely exploit. It is about creating systems and cultures that enable organizations to learn from errors, rather than being threatened by them.

Failure is rich in learning opportunities for a simple reason: in many of its guises, it represents a violation of expectation. Failures are inevitable because the world is complex and we never fully understand its subtleties. The model is not the system. Failure is thus a signpost. It reveals a feature of our world we hadn’t grasped fully and offers vital clues about how to update our models, strategies and behaviors.

The moment when failure is most threatening to our ego is precisely when we need to learn most of all!


Two Mistakes

Psychologists often make a distinction between mistakes where we already know the right answer and mistakes where we don’t.

While the rest of the book explores both these types of mistakes, the crucial point is that in both scenarios, error is indispensable to the process of discovery.

Seeing the Data You Can’t

[Image: bullet-hole locations observed in returning airplanes]

During World War II, many planes returned riddled with gunfire all over the wings and fuselage. Military command thought they should place armour on the areas of the plane where there were holes. It was common sense.

A mathematician, Abraham Wald, disagreed. The command was only considering the planes that had returned; it was not taking into account the planes that had not returned (those shot down). The observable bullet holes suggested that the areas around the cockpit and tail didn’t need reinforcing because they were never hit. In fact, the planes that were hit in those places were crashing precisely because that is where they were most vulnerable.

This is a powerful example to remind you to take into account the data you cannot immediately see. Learning from failure is anything but straightforward. Organizations need new and better ways to go beyond lessons that are superficial. Wald was a black-box thinker par excellence! Knowledge does not progress merely by gathering confirmatory data, but by looking for contradictory data!
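To make Wald’s point concrete, here is a minimal, purely illustrative Python simulation (my own sketch, with made-up lethality numbers, not from the book): every section of the plane is hit equally often, but hits near the cockpit and tail are far more likely to down the plane, so the planes that make it back show the fewest holes exactly where the aircraft is most vulnerable.

```python
import random

# Illustrative assumption: probability that a single hit in a given section downs the plane.
LETHALITY = {"wings": 0.10, "fuselage": 0.15, "engine": 0.40, "cockpit": 0.70, "tail": 0.60}

def simulate(n_planes=10_000, hits_per_plane=3):
    """Count bullet holes per section, but only on the planes that return."""
    observed = {section: 0 for section in LETHALITY}
    returned = 0
    for _ in range(n_planes):
        hits = random.choices(list(LETHALITY), k=hits_per_plane)  # hits fall uniformly
        # A plane survives only if every hit fails to bring it down.
        if all(random.random() > LETHALITY[s] for s in hits):
            returned += 1
            for s in hits:
                observed[s] += 1
    return returned, observed

returned, observed = simulate()
print(f"{returned} planes returned")
for section, holes in sorted(observed.items(), key=lambda kv: -kv[1]):
    print(f"{section:9s} {holes:6d} observed holes (lethality {LETHALITY[section]:.0%})")
# The sections with the fewest observed holes (cockpit, tail) are the most lethal:
# armouring where the returning planes show holes would protect the wrong places.
```

As the output shows, armouring the sections with the most visible holes would protect exactly the wrong places, which is the gap Wald spotted by asking about the planes no one could see.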


The Paradox of Success

This is the paradox of success: it is built upon failure.

Sullenberger, the pilot who landed his plane on the Hudson River, offered this beautiful gem of wisdom:

Everything we know in aviation, every rule in the rule book, every procedure we have, we know because someone somewhere died. We have purchased at great cost, lessons literally bought with blood that we have to preserve as institutional knowledge and pass on to succeeding generations. We cannot have the moral failure of forgetting these lessons and have to relearn them.

Science and Aviation

Failure is hardwired into both the logic and spirit of scientific progress. Mankind’s most successful discipline has grown by challenging orthodoxy and by subjecting ideas to testing. Individual scientists may sometimes be dogmatic but, as a community, scientists recognize that theories, particularly those at the frontiers of our knowledge, are often fallible or incomplete. It is by testing our ideas, subjecting them to failure, that we set the stage for growth.

Science is not just about method, it is also about a mindset. At its best, it is driven forward by a restless spirit, an intellectual courage, a willingness to face up to failures and to be honest about key data, even when they undermine cherished beliefs. It is about method and mindset.

Aviation is different from science but it is underpinned by a similar spirit.


Turning the lights on

Why don’t science students study failed theories? Hollow Earth Theory. Electron Cloud Model… By looking only at the theories that survived, we don’t notice the failures that made them possible. A blind spot!

In many cases, the only way to drive improvement is to find a way of ‘turning the lights on.’ Without access to the ‘error signal,’ one could spend years in training or in a profession without improving at all.

Success is always the tip of an iceberg. A safe aircraft. A true expert. Outside of our view and awareness is a mountain of necessary failure.

Toyota Production System

If anybody on the production line is having a problem, or observes an error, they pull a cord which halts production across the plant. Senior executives rush over to see what has gone wrong and, if an employee is having difficulty performing their job, help them. The error is then assessed, lessons learned, and the system adapted.

The underlying principle need not just apply to car assembly lines. If a culture is open and honest about mistakes, the entire system can learn from them. This is the way you gain improvements.
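As a purely illustrative analogy (my sketch, not Toyota’s actual system), the stop-the-line idea can even be expressed in software terms: any stage of a pipeline can pull a hypothetical “andon cord” that halts everything and records the problem for open review, instead of letting the error slip through hidden.

```python
# Illustrative 'stop the line' sketch in Python (an analogy, not Toyota's real system).
from dataclasses import dataclass, field

@dataclass
class AndonCord:
    pulled: bool = False
    reports: list = field(default_factory=list)

    def pull(self, station: str, problem: str) -> None:
        """Any worker at any station can halt the whole line and report the problem."""
        self.pulled = True
        self.reports.append((station, problem))

def weld():
    raise RuntimeError("misaligned panel")      # a worker notices something wrong

def run_line(stations, cord: AndonCord) -> None:
    for name, step in stations:
        try:
            step()
        except RuntimeError as err:
            cord.pull(name, str(err))           # stop production immediately
            break
    if cord.pulled:
        # The error is examined openly so the process itself can be fixed.
        print("Line halted. Reports to review:", cord.reports)
    else:
        print("Line completed without a cord pull.")

cord = AndonCord()
run_line([("paint", lambda: None), ("weld", weld), ("inspect", lambda: None)], cord)
```

The point of the sketch is the culture it encodes: the problem is surfaced and recorded at the moment it appears, rather than patched over by whoever happened to notice it.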

You can’t understand something you hide.


Cognitive Dissonance

One of the key objectives of the criminal justice system is to ensure that people aren’t punished for crimes they didn’t commit. But miscarriages of justice have a quite different significance.

Wrongful convictions are, in many ways, like plane crashes. They offer an opportunity to probe what went wrong: the police investigation, the accuracy of identifications in line-ups, false confessions, how evidence was presented in court, the deliberations of the jury, the activities of the judge. By learning from failures (the author cites several cases where DNA-based evidence later exonerated people who had been convicted, especially in cases tried before DNA tests were available), we can design reforms that ensure similar mistakes don’t happen again.

But people don’t like to admit to failure. Even police, prosecutors, judges. Professionals are so fearful of their mistakes that they cover them up in various ways, making it impossible to learn from them. This tendency characterizes the response to failure in many areas of our world.

Why? What are the precise psychological mechanisms that underpin error denial? What are the contours of its subtle evasions? Why are they perpetrated by smart, honest people?

‘Cognitive Dissonance’ is a term used to describe the inner tension we feel when, among other things, our beliefs are challenged by evidence. Most of us like to think of ourselves as rational and smart. We reckon we are pretty good at reaching sound judgements. We don’t like to think of ourselves as dupes. That is why when we mess up, particularly on big issues, our self-esteem is threatened. We feel uncomfortable, twitchy.

We end up choosing this: denial. We reframe the evidence. We filter it, we spin it, we ignore it altogether. We build defensive walls and deploy our cognitive filters. This is what makes doctors reframe their mistakes. They are not dishonest people; they are often unaware of the reframing exercise because it is largely subconscious.

Intelligence and seniority, when allied to cognitive dissonance and ego, form one of the most formidable barriers to progress in the world today.

Our memories, too, are not as reliable as we think. We do not encode HD movies and access them at will. Rather, memory is a system dispersed throughout the brain, and it is subject to all sorts of biases. Memories are suggestible. We often assemble fragments of entirely different experiences and weave them together into what seems like a coherent whole. With each recollection, we engage in editing. We try to make the memory fit with what we know rather than what we saw.

We need to create better systems (in criminal justice and other areas) that are sensitive to the flaws in human memory!


Blame Culture

The author shares two chilling stories of how a social worker and even a pilot became victims of a blame culture.

When social workers were subjected to a media witch trial (for failing to protect a child from family abuse), many were convinced that the social work profession would improve its performance. The idea was that blame would force people to sit up and take responsibility. In fact, social workers started leaving the profession en masse, and the numbers entering the profession plummeted. Those who stayed had to carry heavier caseloads. They also started to intervene more aggressively, terrified of the consequences if a child under their supervision was harmed. The number of children removed from their families soared. Crucially, defensiveness started to infiltrate every aspect of social work. The harm done by this media-driven attempt to “increase accountability” was high indeed!

Trying to increase discipline and accountability in the absence of a just culture has precisely the opposite effect. It destroys morale, increases defensiveness and drives vital information deep underground.


Self-Handicapping

How far are people prepared to go to protect their ego at the expense of their own long-term success? The author cites the example of a student group indulging in frolicking and drinking the night before an exam. Were they so terrified of underperforming, so worried that the exam might reveal that they were not very clever, that they needed an alternative explanation for possible failure?

Excuses in life are typically created retrospectively. We have all pointed to a bad night’s sleep, or a cold, or a dog being sick, to justify poor performance. We and others see through these excuses too. But self-handicapping is more sophisticated. This is where the excuse is not cobbled together after the event, but actively engineered beforehand.

Viewed through the prism of the Fixed Mindset, it makes perfect sense. One can admit to a minor flaw [drinking] in order to avoid admitting to a much more threatening one [I am not as bright as I like to think].

What is the point of preserving self-esteem so brittle that it can’t cope with failure? Self-esteem can cause us to jeopardize learning if we think it might risk us looking anything less than perfect. What we really need is resilience: the capacity to face up to failure, and to learn from it. Ultimately, that is what growth is all about.


Fixed vs Growth Mindset

Why do some people learn from their mistakes while others don’t? The difference is ultimately about how we conceptualize our failures. Those in the growth mindset think about error in a different way from those in the fixed mindset. Because they believe that progress is driven, in large part, by practice, they naturally regard failure as an inevitable aspect of learning.

Those who think that success emerges from talent and innate intelligence, on the other hand, are far more likely to be threatened by their mistakes. They will regard failure as evidence that they don’t have what it takes, and never will. They are going to be more intimidated by situations in which they will be judged. Failure is dissonant.

Do read Carol Dweck for more on growth mindset.


What can be done?

The first and most important issue is to create a revolution in the way we think about failure. For centuries, errors of all kinds have been considered embarrassing, morally egregious, almost dirty. This conception still lingers today. It is why children don’t dare to put their hands up in class to answer questions, why doctors reframe mistakes, why politicians resist running rigorous tests on their policies, and why blame and scapegoating are so endemic.

As business leaders, teachers, coaches, professionals and parents, we have to transform this notion of failure. We have to conceptualize it not as dirty and embarrassing, but as bracing and educative. This is the notion we need to instill in our children: that failure is a part of life and learning, and that the desire to avoid it leads to stagnation.

We should praise each other for trying, for experimenting, for demonstrating resilience and resolve, for daring to learn through our own critical investigations, and for having the intellectual courage to see evidence for what it is rather than what we want it to be.

If we only praise each other for getting things right, for perfection, for flawlessness, we will insinuate, if only unintentionally, that it is possible to succeed without failing. We have to challenge this misconception, in our lives, and in our organizations.

The book emphasizes a growth-oriented mindset while outlining some techniques in practice: lean startups, MVPs, pre-mortems (imagine the project has gone horribly wrong before it even begins and write down the reasons), pilot schemes (learn on a small scale), and randomized controlled trials (RCTs).
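To make the RCT idea concrete, here is a minimal, purely illustrative Python sketch (my own, with made-up numbers, not from the book): participants are randomly assigned to a policy group or a control group, so the difference in average outcomes estimates the policy’s true effect rather than a selection artefact.

```python
import random
import statistics

random.seed(42)                                   # reproducible illustration

def run_rct(n=2000, true_effect=0.3):
    """Toy randomized controlled trial on a made-up outcome scale."""
    treatment, control = [], []
    for _ in range(n):
        outcome = random.gauss(0, 1)              # natural variation between people
        if random.random() < 0.5:                 # random assignment removes selection bias
            treatment.append(outcome + true_effect)
        else:
            control.append(outcome)
    return statistics.mean(treatment) - statistics.mean(control)

print(f"Estimated effect: {run_rct():.2f} (true effect set to 0.30)")
```

Because assignment is random, any systematic gap between the groups can only come from the policy itself, which is exactly the kind of honest test of an idea the book is arguing for.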

No one can possibly give us more service than by showing us what is wrong with what we think or do; and the bigger the fault, the bigger the improvement made possible by its revelation. The man who welcomes and acts on criticism will prize it almost above friendship: the man who fights it out of concern to maintain his position is clinging to non-growth.
(Bryan Magee)
