To Succeed, Fail
When we talk about how people have created a successful product, service or idea, we usually look backwards to pick out key moments in their journey that resulted in that success. This form of analysis and explanation simplifies the process, making it seem straightforward, certain and relatively risk free.
I think this storytelling says a lot about how much we dislike failure. So much so that we edit the past to make success appear more certain than it actually was. This is a problem, because failure is a crucial part of experimenting and learning in a complex world. I see this hesitation from people every day in the projects I work on. So the latest book I’ve been reading, ‘Adapt - Why Success Always Starts With Failure’ by Tim Harford, seems timely and relevant. Let me explain.
Failure, as most people would define it, is being ‘unsuccessful in achieving a goal’. With simple problems and an acceptance of well-understood solutions, it’s harder to fail. For example, imagine I go to the kitchen to make lunch and I see peanut butter, bread, jam and chicken. I can take the safe option and make a peanut butter and jelly (jam) sandwich. But what if I am happy to experiment? Maybe I make a chicken satay with peanut butter and jelly. Perhaps I’ve just invented something new. It wouldn’t have happened if I wasn’t willing to fail.
More complex problems present more risk. What about choosing a set of classes to study in higher education? Meeting a new potential partner? Creating a new product, service, idea or social movement? What about endemic obesity, job automation, climate change, equality and diversity? These are not simple problems. They can be tough to frame and there may be many conflicting answers. To solve these complex problems, we use slogans about ‘being brave and innovating’, but our incentives reward playing it safe, making small iterations and generally avoiding failure.
Real-world systems don’t work that way. They don’t play it safe. In evolution, failure is the actual mechanism by which nature blindly and constantly ‘tries’ things out in the long and never-ending story of life. Inventors try out new combinations of materials and technological capabilities, failing along the way. Scientists try out theories, often discarding them when the evidence cannot support them. The bigger the problems get, the more likely we are to fail.
If we are attentive, our failures can be tremendously valuable. The image that leads this article is of the famous chemist William Perkin. In 1856, Perkin was trying to synthesise quinine, an anti-malarial treatment. His experiments with coal tar failed, but produced a strange purple liquid. He realised it made an exceptional dye, which went on to dominate the Victorian colour palette as the colour mauve. Years later, the same coal-tar chemistry gave us modern chemotherapy. Quite a useful failure indeed.
Seth Godin (in his book ‘Linchpin’) makes an argument that we are taught to be afraid of failure from an early age. This comes from being part of a modern educational system that was created to respond to the needs of industrialisation. The need, from the 1920s onward, was for a ‘basic skills’ workforce at scale. It created a model of education focused on learning specific answers to a defined set of questions, where answers were right or wrong or easy to grade. Perhaps there is also a deeper psychology at work here; ancient parts of our behaviour that make us both interested in and afraid of the unknown.
It’s hard to work in business and be comfortable with failure. Classic project management usually seeks to avoid failure. A study by PricewaterhouseCoopers of 10,640 projects across 200 companies found that only 2.5% of the companies successfully completed 100% of their projects. For many, this is taken as a warning bell of project management practices in peril. I’m more curious about the companies with no failed projects. I wonder how many risks they took in their work.
The crux of the problem is this: the most interesting problems in the world, those involving technology, society and culture, are too complex to be simulated or predicted with certainty. Failure is intrinsic to learning about how things really work. This implies a very different model of science, education, design, and policy. One where we learn to incentivise and embrace an emotional mindset of experimentation.
Keep in mind that we don’t need to destroy ourselves in doing so. I’ve seen plenty of endorsements of a ‘fearless’ mindset as a prerequisite for success. A total lack of fear seems to create a different kind of problem, one where we are insensitive to our own potential destruction, where we try everything and anything without any regard for the consequences. Peter Palchinsky, an enterprising Russian engineer from the early 1900s, proposed three rules for failure that encourage experimentation while providing just enough boundaries for safety:
1. Seek out ideas and try new things.
2. When trying something new, do it on a scale where failure is survivable.
3. Seek out feedback and learn from your mistakes as you go along.
Consider especially the second point: ‘When trying something new, do it on a scale where failure is survivable.’ A statistic from a Gartner survey suggests that projects with budgets larger than $1 million are 50% more likely to fail than projects with budgets below $350,000. Perhaps larger budgets engage our aversion to sunk costs, forcing us further and further down an unproductive path.
Whatever the cause, it’s baffling that we are willing to invest so much for so little experimentation. Wouldn’t it be better to invest $50,000 in each of ten experiments, in the hope of finding one solution that we are more confident works, rather than spending another $500,000 on a single solution? Granted, each $50,000 ‘failure’ feels expensive. It feels like money wasted. But it really isn’t. It’s the price of embracing failure. And it might give us a working solution to a complex problem, rather than investing $1 million just to find out something didn’t work.
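The arithmetic behind that trade-off is easy to sketch. As a purely illustrative aside (the 20% success rate per experiment below is an assumed figure, not from any study cited here), a portfolio of small, survivable bets can carry a far better chance of at least one success than a single big bet of the same total cost:

```python
# Illustrative only: compares one big bet against a portfolio of small
# experiments. The 20% per-experiment success rate is an assumption made
# up for the sake of the arithmetic, not a figure from the article.

def chance_of_at_least_one_success(p_single: float, n_experiments: int) -> float:
    """Probability that at least one of n independent experiments succeeds."""
    return 1 - (1 - p_single) ** n_experiments

budget = 500_000
cost_per_experiment = 50_000
n = budget // cost_per_experiment  # ten $50,000 experiments

p = 0.2  # assumed success rate of any single attempt
portfolio = chance_of_at_least_one_success(p, n)

print(f"One ${budget:,} bet: {p:.0%} chance of success")
print(f"{n} x ${cost_per_experiment:,} experiments: {portfolio:.0%} chance of at least one success")
```

Under these assumptions the ten small experiments give roughly an 89% chance that at least one works, against 20% for the single large project. The exact numbers matter less than the shape of the curve: independent small failures compound in our favour.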
It is a dark twist to the tale that Peter Palchinsky, the man behind the three rules for failure, was executed for speaking out against the government. Palchinsky suggested it was a fallacy to centrally plan complex problems without sensitivity to local issues. We really don’t like to fail. We especially don’t like it when others point out our failings.
It is a stark reminder. We need a new culture of experimentation that embodies a willingness to fail and learn.
Harford, Tim (2012) Adapt - Why Success Always Starts with Failure. Picador.
Godin, Seth (2011) Linchpin: Are You Indispensable?. Penguin Publishing Group.
St Clair, Kassia (2016) The Secret Lives of Colour. John Murray.