When you begin this book, it seems as if it will be a straight comparison between the airline safety model of reviewing and learning from accidents (open) and the medical system model of covering up mistakes (closed), and it does describe a few powerful illustrative examples from each of those fields. However, it turns out to have quite a few more dimensions and lessons. For example, it also turns its focus on the criminal justice system (closed) and the political system (closed). These analyses alone would make it a good book and support a strong argument that learning from mistakes is hugely important.
However, the author takes it a step further and looks at some of the psychological reasons why all of us find it so difficult to admit mistakes (cognitive dissonance), and how we so naturally create narratives that support our original decisions. Like some of the best books in this genre, the book forces us to admit that we also are subject to the same kinds of biases that make it difficult to create and maintain "open" systems that encourage us to regularly test our ideas, even while it provides one example after another of why mistakes are essential to learning.
Simon Slater is a good narrator: his pace, accent, and expression contribute to an excellent audiobook.
11 of 11 people found this review helpful
This book is all about failure. It’s about the fact that we hide and stigmatise failure when we should be embracing it - and using it to continuously improve all our enterprises by submitting them to trial and error.
He gives many excellent, moving, and gripping examples of contexts where this approach was lacking and resulted in dire consequences. In the medical profession, senior doctors have very high status and self-esteem, and they don’t like to admit their errors. They use euphemisms such as ‘a complication’ or an ‘adverse event’. The author argues that this lack of openness about error deprives us of the opportunity to analyse what went wrong and use that information to continuously improve our systems. He gives a graphic example of a woman who needlessly dies because a group of doctors have difficulty passing a breathing tube during a routine operation. They become fixated on this task and lose track of time, when they could have performed an emergency tracheotomy – a relatively straightforward lifesaving procedure. The nurse was there, ready with the tracheotomy kit, but she only hinted instead of speaking up forcefully, because of the steep authority gradient between her and the doctors.
A second example is criminal law. Since the invention of DNA testing, it has become apparent that our jails hold many innocent people who were wrongly convicted. But the legal system has been slow to admit its errors and to introduce processes to fix this. Again, high-status people, such as investigators and prosecutors, are reluctant to admit that they are error-prone.
One industry that seems to get this right is aviation. All errors are investigated thoroughly, and recommendations are made to change practice. For example, there have been many crashes caused when junior members of a crew wouldn’t speak up to alert the captain to a danger, because the captain was the commander and speaking up could have resulted in severe rebuke. So the aviation industry changed the culture to a teamwork approach and encouraged all crew members to speak up. This has been a great success, and lessons from it have now been adopted in many medical settings.
In the field of sociology, there was an initiative called ‘Scared Straight’ – designed to put potential delinquents off serious crime by sending them to a prison for three hours to spend time with hardened criminals. It appeared to work, and was subsequently adopted worldwide. But nobody actually tested whether it really did work, beyond sending out some questionnaires. Once it was subjected to rigorous scientific testing using a randomised controlled trial, it was shown that this intervention actually increased criminality in the subjects by about 25%.
The point is, you don’t know whether something is going to succeed or fail unless you test it. You can’t predict whether something will work purely by intuition or because it seems logical – the world is just too complex, and there are too many unknown variables. So you should test your idea, then change it and test it again, and so on. This process works the same way that natural selection works in evolution. The entrepreneur who invented the very successful Dyson vacuum cleaner made over 5,000 prototypes, and this resulted in an excellent product – he wasn’t afraid of failure; he harnessed it as a tool to drive continuous improvement.
As you have probably guessed if you’ve read this far, I enjoyed this book. It’s interesting, and as well as giving an insight into how major institutions and industries could be improved if they embraced failure, it also shares some ideas that we can all apply in our own lives.
4 of 4 people found this review helpful