Erkin Ünlü

Software Engineer

Black Box Thinking

Date read: October 18, 2020

How much I recommend it: 9/10. Go to Amazon or Goodreads for other reviews.

This book was given to me as a gift last year. Despite its gorgeous red cover, I put off reading it until a week ago. I really wish I had read it last year; I could have changed my perspective a year earlier.

Introduction

The book's main theme is openness to learning from failure. Author Matthew Syed basically tells us to embrace our failures and treat them as valuable experiments: the world is an inherently complex place, so it is impossible to model it with complete accuracy. We should therefore view errors as vital information about where our model is wrong and how we should correct it.

Healthcare

This may already sound intuitive to almost everyone, but the book's opening story draws a stark comparison between the healthcare profession, where errors are feared and people actively cover them up, and the aviation industry, where errors are investigated without blaming individuals. This culture in healthcare causes unnecessary deaths and injuries to millions of people around the world.

Aviation

Healthcare is then contrasted with the aviation industry, which is where the title of the book comes from. Every time an accident happens, the black box in the plane (which is no longer black but orange, so it is easier to find in the ocean → probably another example of black box thinking and the marginal gains I'll talk about later on) is retrieved, and experts investigate everything they can find in its recordings. Most importantly, there is definitely no blame game, and everyone is encouraged to cooperate fully. This way, after every accident or near miss, the aviation industry improves its safety bit by bit, or, say, margin by margin.

Software

I'm a software engineer, and black box thinking applies to my field fabulously. When we criticize our industry and our way of working, we do ourselves a service by pushing people to improve their ways (and we can definitely gain much by blaming less, by the way). This has led to a number of best practices, including automated software testing, A/B testing (known as randomized controlled trials in academia), capturing logs/metrics/errors on a third-party server (very similar to black boxes), peer reviews, post-mortems, and retrospectives.
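To make the black-box analogy concrete, here is a minimal Python sketch of error capturing: every unhandled exception gets shipped to a collector service before the process dies, just as a flight recorder preserves the last moments of a flight. The endpoint URL and payload shape are hypothetical stand-ins for whatever real tool (Sentry, an ELK stack, etc.) a team actually uses.

```python
import json
import sys
import traceback
import urllib.request

# Hypothetical collector endpoint -- a stand-in for a real error-tracking service.
COLLECTOR_URL = "https://errors.example.com/ingest"

def ship_error(exc_type, exc_value, exc_tb):
    """Record the 'flight data' of a crash before the process exits."""
    payload = {
        "type": exc_type.__name__,
        "message": str(exc_value),
        "stacktrace": traceback.format_exception(exc_type, exc_value, exc_tb),
    }
    req = urllib.request.Request(
        COLLECTOR_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        urllib.request.urlopen(req, timeout=2)
    except OSError:
        pass  # never let error reporting crash the crash handler itself
    # Fall back to the default hook so the traceback still prints locally.
    sys.__excepthook__(exc_type, exc_value, exc_tb)

# Install the hook: every uncaught exception is now recorded and investigated,
# not silently lost.
sys.excepthook = ship_error
```

The point isn't the transport mechanism; it's the culture the mechanism enables: the crash data exists, so the post-mortem can focus on what happened rather than on who to blame.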

Cognitive dissonance

But then comes another aspect of learning from mistakes: mindset. If your culture or workplace is a "closed loop" environment (the author's term for a culture where errors are shameful and experts are considered infallible), then even when you have the tools to examine mistakes, no one feels safe enough to stick their neck out and try something different. And when a person has a closed mindset, they start to construct their own version of the world, one that gradually detaches from reality. This is called cognitive dissonance, and you can observe it when doctors describe an error as a "one-off thing". With the advent of DNA testing, a similar story emerged in criminal justice systems: wrongly convicted people used DNA evidence to prove their innocence, but the police and prosecutors who had arrested and tried them couldn't bear the fact that these people had been innocent from the start. They literally never changed their minds.

How errors improve us

We know there are two basic approaches to solving problems: top-down and bottom-up. Errors can help with both, and I'll start with the easier one, bottom-up. If we can break a process down into smaller parts, it becomes easier and cheaper to fail in each one of them. This is like writing unit tests for software: if you can divide your solution into small pieces, you can test each piece separately (see the sketch below). The amazing part is that the feedback is immediate! You quickly learn what is wrong with your implementation and correct your course. This is how the British cycling team won the Tour de France and how Google decides the color of its important buttons: through marginal gains resulting from tests.
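As a rough sketch of that fast-feedback loop (the split_bill function and its tests are invented for illustration), each test exercises one small piece of the solution, so a failure points at exactly one piece of logic and the verdict arrives in milliseconds:

```python
import unittest

def split_bill(total_cents: int, people: int) -> list[int]:
    """Divide a bill evenly, handing out any remainder one cent at a time."""
    base, remainder = divmod(total_cents, people)
    return [base + (1 if i < remainder else 0) for i in range(people)]

class SplitBillTests(unittest.TestCase):
    def test_even_split(self):
        self.assertEqual(split_bill(900, 3), [300, 300, 300])

    def test_remainder_is_distributed(self):
        # A failure here points at exactly one small piece of logic.
        self.assertEqual(split_bill(1000, 3), [334, 333, 333])

    def test_nothing_is_lost(self):
        self.assertEqual(sum(split_bill(1001, 7)), 1001)

if __name__ == "__main__":
    unittest.main()
```

Each cheap, isolated failure is a marginal gain in miniature: a tiny error surfaced early, corrected immediately, and never repeated.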

For the top-down approach, failures give us the opportunity to come up with innovative ideas. If you are frustrated enough with a problem, you may get that a-ha moment of a creative solution. Failure bolsters creativity. But this giant leap must be followed by the bottom-up discipline of cheap errors and fast feedback; otherwise your great idea will just be an entry in a patent registry, while people like James Dyson make billions out of a seemingly simple solution to vacuum-cleaner bags clogging the air filter.

Verdict

Learning from mistakes, fast feedback, and marginal gains are notions we should definitely incorporate into our industries, wherever we work. We'll all be better off once we change our mindsets to learn from failures and improve ourselves through them.