Black Box Thinking

  • by Matthew Syed
  • Narrated by Simon Slater
  • 12 hrs and 14 mins
  • Unabridged Audiobook

Publisher's Summary

Nobody wants to fail. But in highly complex organizations, success can happen only when we confront our mistakes, learn from our own version of a black box, and create a climate where it's safe to fail.
We all have to endure failure from time to time, whether it's underperforming at a job interview, flunking an exam, or losing a pickup basketball game. But for people working in safety-critical industries, getting it wrong can have deadly consequences. Consider the shocking fact that preventable medical error is the third-biggest killer in the United States, causing more than 400,000 deaths every year. More people die from mistakes made by doctors and hospitals than from traffic accidents. And most of those mistakes are never made public because of malpractice settlements with nondisclosure clauses.
For a dramatically different approach to failure, look at aviation. Every passenger aircraft in the world is equipped with an almost indestructible black box. Whenever there's any sort of mishap, major or minor, the box is opened, the data is analyzed, and experts figure out exactly what went wrong. Then the facts are published and procedures are changed, so the same mistakes won't happen again. By applying this method in recent decades, the industry has created an astonishingly good safety record.
Few of us put lives at risk in our daily work, as surgeons and pilots do, but we all have a strong interest in avoiding predictable and preventable errors. So why don't we all embrace the aviation approach to failure rather than the health-care approach? As Matthew Syed shows in this eye-opening audiobook, the answer is rooted in human psychology and organizational culture.
Syed argues that the most important determinant of success in any field is an acknowledgment of failure and a willingness to engage with it....

Customer Reviews

Most Helpful

A multi-level message, well written and well read

When you begin this book, it seems as if it will be a straight comparison between the airline safety model of reviewing and learning from accidents (open) and the medical system model for covering up mistakes (closed), and it does describe a few powerful illustrative examples from each of those fields. However, it turns out to have quite a few more dimensions and lessons. For example, it also turns its focus to the criminal justice system (closed) and the political system (closed). These analyses alone would make it a good book and support a strong argument that learning from mistakes is hugely important.

However, the author takes it a step further and looks at some of the psychological reasons why all of us find it so difficult to admit mistakes (cognitive dissonance), and how we so naturally create narratives that support our original decisions. Like some of the best books in this genre, the book forces us to admit that we also are subject to the same kinds of biases that make it difficult to create and maintain "open" systems that encourage us to regularly test our ideas, even while it provides one example after another of why mistakes are essential to learning.

Simon Slater is a good narrator: his pace, accent, and expression all contribute to an excellent audiobook.

- Loren

Epic Fail

This book is all about failure. It’s about the fact that we hide and stigmatise failure when we should be embracing it - and using it to continuously improve all our enterprises by submitting them to trial and error.

The author gives many excellent, moving and gripping examples of contexts where this approach was lacking and resulted in dire consequences. In the medical profession, senior doctors have very high status and self-esteem, and they don’t like to admit their errors; they use euphemisms such as ‘a complication’ or ‘an adverse event’. The author argues that this lack of openness about error deprives us of the opportunity to analyse what went wrong and use that information to continuously improve our systems. He gives a graphic example of a woman who needlessly dies because a group of doctors find it difficult to pass a breathing tube during a routine operation. They become fixated on this task and lose track of time, when they could have performed an emergency tracheotomy, a relatively straightforward lifesaving procedure. A nurse is standing by with the tracheotomy kit, but she only hints at it instead of speaking up forcefully, because of the steep authority gradient between her and the doctors.

A second example is criminal law. Since the invention of DNA testing, it has become apparent that our jails hold many innocent people who were wrongly convicted. But the legal system has been slow to admit its errors and to introduce processes to fix this. Again, high-status people, such as investigators and prosecutors, are reluctant to admit that they are error-prone.

One industry that seems to get this right is aviation. All errors are investigated thoroughly, and recommendations are made to change practice. For example, many crashes have resulted from junior crew members not speaking up to alert the captain to a danger, because the captain was the commander and speaking up could have led to a severe rebuke. So the aviation industry changed its culture to a teamwork approach and encouraged all crew members to speak up. This has been a great success, and the lessons have now been adopted in many medical settings.

In the field of sociology, an initiative called ‘Scared Straight’ was introduced, designed to put potential delinquents off serious crime by sending them to a prison for 3 hours to spend time with hardened criminals. It appeared to work and was subsequently adopted worldwide. But nobody actually tested whether it really did work, beyond sending out some questionnaires. Once it was subjected to rigorous scientific testing using a randomised controlled trial, it was shown that the intervention actually increased criminality in its subjects by about 25%.

The point is, you don’t know if something is going to succeed or fail unless you test it. You can’t predict whether something will work purely by intuition, or because it seems logical – the world is just too complex and there are too many unknown variables. So you should test your idea, then change it and test it again, and so on. This process works much the same way that natural selection works in evolution. The entrepreneur who invented the very successful Dyson vacuum cleaner made over 5,000 prototypes before arriving at an excellent product – he wasn’t afraid of failure; he harnessed it as a tool to drive continuous improvement.

As you have probably guessed if you’ve read this far, I enjoyed this book. It’s interesting, and as well as giving an insight into how major institutions and industries could be improved if they embraced failure, it shares some ideas that we can all apply in our own lives.

- Mark

Book Details

  • Release Date: 11-03-2015
  • Publisher: Penguin Audio