Queen, rock group
St Peter’s Church seems untouched by the impatient swirl of downtown Vienna. It stands in a narrow square, tucked away from the noisy shopping streets that criss-cross the Austrian capital. Buildings lean in from all sides like soldiers closing ranks. Visitors often wander past without even noticing the church’s delicious baroque façade and green domes.
Stepping through the immense wooden doors is like passing through a wormhole to a time when there were few reasons to rush. Gregorian chants whisper from hidden speakers. Candles cast flickering light on gilded altarpieces and paintings of the Virgin Mary. The smell of burning incense sweetens the air. A stone staircase, winding and weathered, leads down into a crypt dating back a thousand years. With thick walls blocking out mobile phone signals, the silence feels almost metaphysical.
I have come to St Peter’s to discuss the virtues of slowing down. It is a soirée for business people, but some clergy are also present. At the end of the evening, when most of the guests have dispersed into the Viennese night, Monsignor Martin Schlag, resplendent in his purple cassock, comes up to me, a little sheepishly, to make a confession. ‘As I was listening to you, I suddenly realised how easy it is for all of us to get infected by the impatience of the modern world,’ he says. ‘Lately, I must admit, I have been praying too fast.’
We both laugh at the irony of a man of the cloth behaving like a man in a suit, but his transgression underlines just how deep the quick-fix impulse runs. After all, prayer may be the oldest ritual for solving problems. Throughout history and across cultures, our ancestors have turned to gods and spirits in times of need, seeking help in tackling everything from floods and famine to drought and disease. Whether praying can actually solve problems is a matter of debate, but one thing is clear: no god has ever offered succour to those who pray faster. ‘Prayer is not meant to be a shortcut,’ says Monsignor Schlag. ‘The whole point of praying is to slow down, listen, think deeply. If you hurry prayer, it loses its meaning and power. It becomes an empty quick fix.’
If we are going to start solving problems thoroughly, we must first understand our fatal attraction to speedy solutions. We need to know why even people like Monsignor Schlag, who devote their lives to serene contemplation in places like St Peter’s, still fall for the quick fix. Are we somehow hardwired to reach for the duct tape? Does modern society make it harder to resist peeing on frozen legs?
After my encounter with the monsignor, I turn to a secular expert on the workings of the human brain. Peter Whybrow is a psychiatrist and director of the Semel Institute for Neuroscience and Human Behavior at the University of California, Los Angeles. He is also the author of a book called American Mania, which explores how brain machinery that helped early man survive in a world of privation makes us prone to gorging in the modern age of plenty. Along with many in the field of neuroscience, he believes our addiction to the quick fix has physiological roots.
The human brain has two basic mechanisms for solving problems, which are commonly known as System 1 and System 2. The first is fast and intuitive, almost like thinking without thinking. When we see a lion eyeing us from across a watering hole, our brains instantly map out the best escape route and send us hurtling towards it. Quick fix. Problem solved. But System 1 is not just for life-or-death situations. It is the shortcut we use to navigate through daily life. Imagine if you had to reach every decision, from which sandwich to buy at lunch to whether to smile back at that fetching stranger on the subway, through deep analysis and anguished navel-gazing. Life would be unbearable. System 1 saves us the trouble.
By contrast, System 2 is slow and deliberate. It is the conscious thinking we do when asked to calculate 23 times 16 or analyse the possible side effects of a new social policy. It involves planning, critical analysis and rational thought, and is driven by parts of the brain that continue to develop after birth and into adolescence, which is why children are all about instant gratification. Not surprisingly, System 2 consumes more energy.
System 1 was a good match for life in the distant past. Our early ancestors had less need to ruminate deeply or take the long view. They ate when hungry, drank when thirsty and slept when tired. ‘There was no tomorrow when living on the savannah, and survival depended on what you did each day,’ says Whybrow. ‘So the physiological systems that we inherited in the brain and body focused on finding short-term solutions and rewarding us for pursuing them.’ After farming began to take hold 10,000 years ago, planning for the future became an asset. Now, in a complex, post-industrial world, System 2 should be king.
Only it is not. Why? One reason is that, inside our 21st-century heads, we are still roaming the savannah. System 1 holds sway because it takes a lot less time and effort. When it kicks in, the brain floods with reward chemicals like dopamine, which deliver the kind of feel-good jolt that keeps us coming back for more. That’s why you get a little thrill every time you graduate to the next level in Angry Birds or cross an item off your To-Do list: job done, reward delivered, move on to the next thrill. In the cost–benefit calculus of neuroscience, System 1 offers maximum return for minimum effort. The rush it delivers can even become an end in itself. Like coffee addicts itching for a shot of caffeine, or smokers dashing outside for a cigarette, we get hooked on the quick fix of the quick fix. By comparison, System 2 can seem a dour taskmaster, demanding toil and sacrifice today in return for the promise of some vague pay-off in the future, like a personal trainer barking at us to eschew that chocolate éclair in favour of another 20 push-ups, or a parent nagging us to hit the books instead of running outside to play. Henry Ford was referring to System 2 when he said, ‘Thinking is the hardest work there is, which is the probable reason why so few engage in it.’
System 2 can also act like a spin doctor, rationalising our preference for short-term rewards. After yielding to temptation and wolfing down that éclair, we convince ourselves that we deserved a treat, needed the energy boost or will burn off the extra calories in the gym. ‘The bottom line is that the primitive brain is wired for the quick fix; it always has been,’ says Whybrow. ‘The delayed gratification that comes with taking the long view is hard work. The quick fix comes more naturally to us. That’s where we get our pleasure. We enjoy it and soon we want it quicker and quicker.’
That is why our ancestors warned against quick fixes long before Toyota invented the Andon rope. In the Bible, Peter urges Christians to be patient: ‘The Lord is not slow to fulfil his promise as some count slowness, but is patient towards you, not wishing that any should perish, but that all should reach repentance.’ Translation: God is not in the business of supplying real-time solutions. Nor was it just religious authorities that fretted over man’s soft spot for the siren call of short-termism. John Locke, a leading thinker of the Enlightenment, warned that quick-fix merchants were on the road to ruin. ‘He that has not mastery over his inclinations, he that knows not how to resist the importunity of present pleasure or pain, for the sake of what reason tells him is fit to be done, wants the true principle of virtue and industry, and is in danger never to be good at anything,’ he wrote. A century later, Alexander Hamilton, one of the founding fathers of the United States of America, restated the danger: ‘Momentary passions and immediate interests have a more active and imperious control over human conduct than general or remote considerations of policy, utility or justice.’ A distrust of snap decisions lingers even in the modern era. In the face of a dire medical diagnosis, the conventional advice is to seek a second opinion. Governments, businesses and other organisations spend billions gathering the data, research and analysis to help them solve problems thoroughly.
So, why, despite all these warnings and exhortations, do we still fall for the quick fix? The lure of System 1 is only part of the explanation. Over hundreds of thousands of years, the human brain has evolved a whole array of quirks and mechanisms that distort our thinking and nudge us in the same direction.
Consider our natural penchant for optimism. Across cultures and ages, research has shown that most of us expect the future to be better than it ends up being. We significantly underestimate our chances of being laid off, divorced or diagnosed with a fatal illness. We expect to sire gifted children, outperform our peers and live longer than we actually do. To paraphrase Samuel Johnson, we let hope triumph over experience. This tendency may have an evolutionary purpose, spurring us to strive and push forward, rather than retreat to a dark corner to brood on the unfairness of it all. In The Optimism Bias, Tali Sharot argues that belief in a better future fosters healthier minds in healthier bodies. Yet she warns that too much optimism can backfire. After all, who needs regular health check-ups or a retirement savings plan if everything is going to pan out in the end? ‘“Smoking kills” messages don’t work because people think their chances of cancer are low,’ says Sharot. ‘The divorce rate is 50 per cent, but people don’t think it’s the same for them. There is a very fundamental bias in the brain.’ And that bias affects the way we tackle problems. When you slip on the rose-tinted spectacles, the easy quick fix suddenly looks a whole lot more plausible.
The human brain also has a natural fondness for familiar solutions. Instead of taking the time to understand a problem on its own merits, our habit is to reach for fixes that have worked on similar problems in the past, even when better options are staring us in the face. This bias, uncovered in study after study, is known as the Einstellung effect. It was useful back in the days when mankind faced a limited set of urgent and straightforward problems such as how to avoid being eaten by a lion; it is less helpful in a modern world of spiralling complexity. The Einstellung effect is one reason we often make the same mistakes over and over again in politics, relationships and careers.
Another is our aversion to change. Conservatives do not have a monopoly on wanting to keep things as they are. Even when confronted with compelling arguments for a fresh start, the human instinct is to stay put. That’s why we can read a self-help book, nod in agreement all the way through, and then fail to put any of the advice into practice. Psychologists call this inertia the ‘status-quo bias’. It explains why we always sit in the same place in a classroom when there is no seating plan or stick with the same bank, pension provider and utility company when rivals offer better deals. This resistance to change is woven into our vernacular. ‘If it ain’t broke, don’t fix it,’ we say, or, ‘You can’t teach an old dog new tricks.’ Along with the Einstellung effect, the status-quo bias makes it harder for us to break out of a quick-fix rut.
Combine that with our reluctance to admit mistakes and you end up with another obstacle to the Slow Fix: the so-called ‘legacy problem’. The more we invest in a solution – staff, technology, marketing, reputation – the less inclined we are to question it or search for something better. That means we would rather stand by a fix that is not working than start looking for one that does. Even the nimblest problem-solvers can fall into this trap. In the early 2000s a trio of software whizzes in Estonia wrote some code that made it easy to make telephone calls over the Internet. Result: the birth of one of the fastest-growing companies of the 21st century. A decade later the Skype headquarters in Tallinn, the capital of Estonia, remains a shrine to start-up chic, with bare brick walls, bean bags and funky art. Everywhere you look, multinational hipsters are sipping mineral water or fiddling with iPads. On a landing near the room where I meet Andres Kütt, Skype’s young, goateed business evangelist, stands a whiteboard covered in squiggles from the last brainstorming session.
Even in this iconoclastic bear pit, the wrong fix can win stubborn defenders. At 36, Kütt is already a seasoned problem-solver. He helped pioneer Internet banking and spearheaded efforts to get Estonians to file their tax returns online. He worries that, by growing old enough and big enough to have vested interests, Skype has lost some of its problem-solving mojo. ‘Legacy is now a big problem for us, too,’ he says. ‘You make a massive investment to solve a problem and suddenly the problem is surrounded by a huge number of people and systems that want to justify their existence. You end up with a scenario where the original source of the problem is hidden and hard to reach.’ Rather than change tack, people in those circumstances usually plough on with the prevailing fix. ‘It is scary to step back and deal with the idea that your old solutions may not even work, and to contemplate investing time, money and energy in finding better ones,’ says Kütt. ‘It’s so much easier and safer to stay in your comfort zone.’
Clinging to a sinking ship may be irrational, but the truth is we are not as rational as we like to imagine. Study after study shows that we assume people with deeper voices (usually men) are cleverer and more trustworthy than those who speak in a higher register (usually women). We also tend to think good-looking folk are smarter and more competent than they really are. Or consider the Side Salad Illusion. In one study carried out at the Kellogg School of Management, people were asked to estimate the number of calories in unhealthy foods, such as bacon-and-cheese waffles. They then guessed the caloric content of those same foods when paired with a healthy side dish, such as a bowl of carrot and celery sticks. Time and again, people concluded that adding a virtuous accompaniment made the whole meal contain fewer calories, as if the healthy food could somehow make the unhealthy food less fattening. And this halo effect was three times more pronounced among avid dieters. The conclusion of Alexander Chernev, the lead researcher: ‘People often behave in a way that is illogical and ultimately counterproductive to their goals.’
You can say that again. Our gift for tunnel vision can seem limitless. When confronted by awkward facts that challenge our favoured view – proof that our quick fix is not working, for instance – we tend to write them off as a rogue result, or as evidence that ‘the exception proves the rule’. This is known as the confirmation bias. Sigmund Freud called it ‘denial’, and it goes hand in hand with the legacy problem and the status-quo bias. It can generate a powerful reality distortion field. When told by doctors they are going to die, many people block out the news entirely. Sometimes we cling to our beliefs even in the face of slam-dunk evidence to the contrary. Look at the cottage industry in Holocaust denial. Or how, in the late 1990s, Thabo Mbeki, then the president of South Africa, refused to accept the scientific consensus that AIDS is caused by HIV, leading to the deaths of more than 330,000 people.
Even when we have no vested interest in distorting or filtering out information, we are still prone to tunnel vision. In an experiment repeated dozens of times on YouTube, test subjects are asked to count the number of passes made by one of two teams playing basketball together in a video. Because both sides have a ball, and the players are constantly weaving in and out around one another, this demands real concentration. Often, that sort of focus is useful, allowing us to block out the distractions that militate against deep thinking. But sometimes it can narrow the lens so we miss valuable bits of information and fail to see the forest for the trees. Halfway through the video a man dressed in a gorilla suit wanders into the middle of the basketball game, turns towards the camera, beats his chest, and walks out again. Guess how many people fail to spot the gorilla? More than half.
What all this underlines is an alarming truth: the human brain is chronically unreliable. The optimism, status-quo and confirmation biases; the lure of System 1; the Einstellung effect, denial and the legacy problem – sometimes it seems as if embracing the quick fix is our biological destiny. Yet neurological wiring is only part of the story. We have also built a roadrunner culture that steers us into Quick Fix Avenue.
These days, hurry is our answer to every problem. We walk fast, talk fast, read fast, eat fast, make love fast, think fast. This is the age of speed yoga and one-minute bedtime stories, of ‘just in time’ this and ‘on demand’ that. Surrounded by gadgets that perform minor miracles at the click of a mouse or the tap of a screen, we come to expect everything to happen at the speed of software. Even our most sacred rituals are under pressure to streamline, accelerate, get up to speed. Churches in the United States have experimented with drive-thru funerals. Recently the Vatican was forced to warn Catholics they could not gain absolution by confessing their sins through a smartphone app. Even our recreational drugs of choice nudge us into quick fix mode: alcohol, amphetamines and cocaine all shift the brain into System 1 gear.
The economy ramps up the pressure for quick fixes. Capitalism has rewarded speed since long before high-frequency trading. The faster investors turn a profit, the faster they can reinvest to make even more money. Any fix that keeps the cash flowing, or the share price buoyant, stands a good chance of carrying the day – because there is money to be made right now and someone else can clean up the mess later. That mindset has sharpened over the last two decades. Many companies spend more time fretting over what their stock prices are doing today than over what will make them stronger a year from now. With so many of us working on short-term contracts, and hopping from job to job, the pressure to make an instant impact or tackle problems with little regard for the long term is immense. This is especially true in the boardroom, where the average tenure for a global CEO has fallen sharply in recent years. In 2011, Leo Apotheker was fired as the boss of Hewlett-Packard after less than 11 months in the post. Dominic Barton, the managing director of McKinsey and Company, a leading consulting firm, hears the same lament from chief executives around the world: we no longer have enough time or incentive to look beyond the next quick fix. His verdict: ‘Capitalism has become too short-term.’
Modern office culture tends to reinforce that narrowing of horizons. When did you last have the time to take a long, hard look at a problem at work? Or even just to think deeply for a few minutes? Never mind tackling the big questions, such as where you want to be five years from now or how you might redesign your workplace from the bottom up. Most of us are too distracted by a never-ending blizzard of trivial tasks: a document to sign, a meeting to attend, a phone call to answer. Surveys suggest business professionals now spend half their working hours simply managing their email and social media inboxes. Day after day, week after week, the immediate trumps the important.
Politics is also steeped in the quick fix. Elected officials have every incentive to favour policies that will bear fruit in time for the next election. A cabinet minister may need results before the next reshuffle. Some analysts argue that each US administration enjoys only six months – that window between the Senate’s confirming its staff and the start of electioneering for the mid-term elections – when it can look beyond the daily headlines and polling numbers to concentrate on strategic decisions over the long term. Nor does it help that we tend to favour decisive, shoot-from-the-hip leadership. We love the idea of a lone hero riding into town with a ready-made solution in his saddle bag. How many figures have ever won power by declaring ‘It will take me a long time to work out how to solve our problems?’ Slowing down to reflect, analyse or consult can seem indulgent or weak, especially in moments of crisis. Or as one critic of the more cerebral Barack Obama put it: ‘We need a leader, not a reader.’ Daniel Kahneman, author of Thinking, Fast and Slow and only the second psychologist ever to win the Nobel Prize for Economics, believes our natural preference for politicians who follow their gut turns democratic politics into a carousel of quick fixes. ‘The public likes fast decisions,’ he says, ‘and that encourages leaders to go with their worst intuitions.’
Nowadays, though, it is no longer just politicians and business chiefs who believe they can wave a magic wand. We’re all at it in this age of bullshit, bluster and blarney. Look at the parade of tone-deaf wannabes vowing to be the next Michael Jackson or Lady Gaga on The X Factor. With so much pressure to stand out, we embellish our CVs, post flattering photos on Facebook and holler for attention on blogs and Twitter. A recent study found that 86 per cent of 11-year-olds use social media to build their ‘personal brand’ online. Some of this chest-thumping may win friends and influence people, but it can also drive us into the arms of the quick fix. Why? Because we end up lacking the humility to admit that we do not have all the answers, that we need time and a helping hand.
The self-help industry must take some of the blame for this. After years of reading and writing about personal development, Tom Butler-Bowdon fell out of love with his own field. Too many motivational gurus, he decided, hoodwink the public with short cuts and quick fixes that do not really work. As a riposte, he published Never Too Late to Be Great, which shows how the best solutions in every field, from the arts to business to science, usually have a long gestation period. ‘By glossing over the fact that it takes time to produce anything of quality, the self-help industry has bred a generation of people that expect to fix everything tomorrow,’ he says.
The media add fuel to that fire. When anything goes wrong – in politics, business, a celebrity relationship – journalists pounce, dissecting the crisis with glee and demanding an instant remedy. After the golfer Tiger Woods was outed as a serial philanderer, he vanished from the public eye for three months before finally breaking his silence to issue a mea culpa and announce he was in therapy for sex addiction. How did the media react to being made to wait that long? With fury and indignation. The worst sin for a public figure on the ropes is to fail to serve up an instant exit strategy.
That impatience fuels a tendency to overhype fixes that later turn out to be complete turkeys. An engineer by training, Marco Petruzzi worked as a globetrotting management consultant for 15 years before abandoning the corporate world to build better schools for the poor in the United States. We will meet him again later in the book, but for now consider his attack on our culture of hot air. ‘In the past, hard-working entrepreneurs developed amazing stuff over time, and they did it, they didn’t just talk about it, they did it,’ he says. ‘We live in a world now where talk is cheap and bold ideas can create massive wealth without ever having to deliver. There are multi-billionaires out there who never did anything but capture the investment cycle and the spin cycle at the right moment, which just reinforces a culture where people don’t want to put in the time and effort to come up with real and lasting solutions to problems. Because if they play their cards right, and don’t worry about the future, they can get instant financial returns.’
From most angles, then, the quick fix looks unassailable. Everything from the wiring of our brains to the ways of the world seems to favour band-aid solutions. Yet all is not lost. There is hope. Wherever you go in the world today, and in every walk of life, more people are turning away from the quick fix to find better ways to solve problems. Some are toiling below the radar, others are making headlines, but all share one thing in common: a hunger to forge solutions that actually work.
The good news is the world is full of Slow Fixes. You just have to take the time to find and learn from them.
CHAPTER TWO
CONFESS: The Magic of Mistakes and the Mea Culpa
Success does not consist in never making mistakes but in never making the same one a second time.
George Bernard Shaw
On a crisp night in early September, four Typhoon fighter jets roared across the sky above the freezing waters of the North Sea. Locked in a two-on-two dogfight, they swooped, banked and sliced through the darkness at up to 500 miles per hour, searching for a kill-shot. It was a training exercise, but to the pilots it all seemed very real. Strapped into his cockpit, with 24,000 pounds of killing machine throbbing at his fingertips, Wing Commander Dicky Patounas was feeling the adrenaline. It was his first night-time tactical sortie in one of the most powerful fighter jets ever built.
‘We’re in lights off because we’re doing this for real, which we don’t do very often, so it’s pitch black and I’m on goggles and instruments only,’ Patounas recalls. ‘I’m working the radar, putting it in the right mode by shortening the range, changing the elevation, all basic stuff. But the plane was new to me, so I’m maxed out.’ And then something went wrong.
A few months later Patounas relives that night back on the ground. His air base, RAF Coningsby, is in Lincolnshire, an eastern county of England whose flat, featureless terrain is prized more by aviators than by tourists. Dressed in a green flight suit festooned with zippers, Patounas looks like a Top Gun pilot from central casting – square jaw, broad shoulders, ramrod posture and cropped hair. He whips out pen and paper to illustrate what happened next on that September night, speaking in the clipped tones of the British military.
Patounas was flying behind the two ‘enemy’ Typhoons when he decided to execute a manoeuvre known as the overshoot to a Phase 3 Visual Identification (VID). He would pull out to the left and then slingshot back onto his original course, popping up right behind the trailing enemy plane. But something unforeseen happened. Instead of holding their course, the two rival jets up ahead banked left to avoid a helicopter 20 miles away. Both pilots announced the change on the radio but Patounas failed to hear it because he was too distracted executing his manoeuvre. ‘It’s all quite technical,’ he says. ‘You’ve got to do 60 degrees angle of bank through 60 degrees and then roll out for 20 seconds, then put your scanner down by 4 degrees, then change your radar to 10-mile scale, and after 20 seconds you come right using 45 degrees angle of bank, you go through 120 degrees, you roll out and pick up the guy on your radar and he should be at about 4 miles. So I’m working all this out and I miss the radio call stating the new heading.’
When Patounas rolled back out of the manoeuvre, he spotted an enemy Typhoon in front of him just as expected. He was pumped. ‘This aircraft now appears under my cross where I put it for the guy to appear, so I think I’ve done the perfect overshoot,’ he says. ‘I’ve set my radar up, pitched back in and the guy I’m looking for is under my cross in the pitch black. And I go, “I’m a genius, I’m good at this shit.” I was literally thinking I’ve never flown one so perfectly.’
He shakes his head and laughs wryly at his own hubris: it turned out the wrong Typhoon was in his crosshairs. Instead of ending up behind the trailing jet, Patounas was following in the slipstream of the frontrunner – and he had no idea. ‘It was my mistake: I basically lost awareness of two of the aircraft,’ he says. ‘I knew they were there but I didn’t ensure I could see two tracks. What I should have done was bump the range scale up and have a look for the other guy, but I didn’t because I said to myself, “This is perfect.”’
The result was that Patounas passed within 3,000 feet of the rear Typhoon. ‘It wasn’t that close but the key is I had no awareness, because I didn’t even know he was there,’ he says. ‘It could have been three feet, or I could have flown right into him.’ Patounas falls quiet for a moment, as if picturing the worst-case scenario. On that September night his wingman watched the whole fiasco unfold, knew there was no real danger of a collision and allowed the exercise to continue, but a similar mistake in real combat could have been catastrophic – and Patounas knew it.
The rule of thumb in civil aviation is that a typical air accident is the result of seven human errors. Each mistake on its own may be harmless, even trivial, but string them together and the net effect can be lethal. Flying modern fighter jets, with their fiendishly complex computer systems, is an especially risky business. While enforcing the no-fly zone over Libya in 2011, a US F-15E crashed outside Benghazi after a mechanical failure. A month earlier, two F-16s from the Royal Thai air force fell from the sky during a routine training exercise.
What was surprising about the Typhoon incident over the North Sea was not that it happened but how Patounas reacted: he told everyone about his mistake. In the macho world of the fighter pilot, mea culpas are thin on the ground. As a 22-year veteran of the RAF and commander of a squadron of 18 Typhoon pilots, Patounas had a lot to lose yet still gathered together his entire crew and owned up. ‘I could have come away from this and not said anything, but the right thing to do was to raise it, put it into my report and get it in the system,’ he says. ‘I briefed the whole squadron on how I make mistakes and the mistake I made. That way people know I’m happy to put my hand up and say I messed up too, I’m human.’
This brings us to the first ingredient of the Slow Fix: admitting when we are wrong in order to learn from the error. That means taking the blame for serious blunders as well as the small mistakes and near misses, which are often warning signs of bigger trouble ahead.
Yet highlighting errors is much harder than it sounds. Why? Because there is nothing we like less than owning up to our mistakes. As social animals, we put a high premium on status. We like to fare bella figura, as the Italians say, or look good in front of our peers – and nothing ruins a nice figura more than screwing something up.
That is why passing the buck is an art form in the workplace. My first boss once gave me a piece of advice: ‘Remember that success has many fathers but failure is an orphan.’ Just look at your own CV – how many of your mistakes from previous jobs are listed there? On The Apprentice, most boardroom showdowns involve contestants pinning their own blunders on rivals. Even when big money is at stake, companies often choose to bury their heads in the sand rather than confront errors. Nearly half of financial services firms do not step in to rescue a floundering project until it has missed its deadline or run over budget. Another 15 per cent lack a formal mechanism to deal with a project’s failure.
Nor does it help that society often punishes us for embracing the mea culpa. In a hyper-competitive world, rivals pounce on the smallest error, or the tiniest whiff of doubt, as a sign of weakness. Though Japanese business chiefs and politicians sometimes bow and beg for forgiveness, their counterparts elsewhere bend both language and credibility to avoid squarely owning up to a mistake. In English, the word ‘problem’ has been virtually excised from everyday speech in favour of anodyne euphemisms such as ‘issue’ and ‘challenge’. Hardly a surprise when studies show that executives who conceal bad news from the boss tend to climb the corporate ladder more quickly.
In his retirement, Bill Clinton makes it a rule to say ‘I was wrong’ or ‘I didn’t know that’ at least once a day. If such a moment fails to arise naturally, he goes out of his way to engineer one. He does this to short-circuit the Einstellung effect and all those other biases we encountered earlier. Clinton knows the only way to solve problems in a complex, ever-changing world is to keep an open mind – and the only way to do that is to embrace your own fallibility. But can you imagine him uttering those phrases while he was President of the United States? Not a chance. We expect our leaders to radiate the conviction and certainty that come from having all the answers. Changing direction, or your mind, is never taken as proof of the ability to learn and adapt; it is derided as flip-flopping or wimping out. If President Clinton had confessed to making mistakes, or entertaining doubts about his own policies, his political enemies and the media would have ripped him to pieces.
The threat of litigation is another incentive to shy away from a proper mea culpa. Insurance companies advise clients never to admit blame at the scene of a traffic accident, even if the crash was clearly their fault. Remember how long it took BP to issue anything resembling an official apology for the Deepwater Horizon oil spill? Nearly two months. Behind the scenes, lawyers and PR gurus pored over legal precedents to fashion a statement that would appease public opinion without opening the door to an avalanche of lawsuits. Nor is it just companies that shrink from accepting blame. Even after they leave office and no longer need to woo the electorate, politicians find it hard to own up to their errors. Neither Tony Blair nor George W. Bush has properly apologised for invading Iraq in search of weapons of mass destruction that did not exist. Remove individual ego from the equation, and collectively we still shy away from mea culpas. Britain waited nearly four decades to issue a formal apology for the Bloody Sunday massacre in Northern Ireland in 1972. Australia only apologised in 2008 for the horrors visited upon its aboriginal peoples, followed a year later by the US Senate apologising to African-Americans for the wrongs of slavery.
Even when there are no witnesses to our slip-ups, admitting we are wrong can be wrenching. ‘Nothing is more intolerable,’ Ludwig van Beethoven noted, ‘than to have to admit to yourself your own errors.’ Doing so forces you to confront your frailties and limitations, to rethink who you are and your place in the world. When you mess up, and admit it to yourself, there is nowhere to hide. ‘This is the thing about fully experiencing wrongness,’ wrote Kathryn Schulz in her book Being Wrong. ‘It strips us of all our theories, including our theories about ourselves … it leaves us feeling flayed, laid bare to the bone and the world.’ Sorry really is the hardest word.
This is a shame, because mistakes are a useful part of life. To err is human, as the saying goes. Error can help us solve problems by showing us the world from fresh angles. In Mandarin, the word ‘crisis’ is rendered with two characters, one signifying ‘danger’, the other ‘opportunity’. In other words, every screw-up holds within it the promise of something better – if only we take the time to acknowledge and learn from it. Artists have known this for centuries. ‘Mistakes are almost always of a sacred nature,’ observed Salvador Dalí. ‘Never try to correct them. On the contrary: rationalise them, understand them thoroughly. After that, it will be possible for you to sublimate them.’
That same spirit reigns in the more rigorous world of science, where even a failed experiment can yield rich insights and open new paths of inquiry. Many world-changing inventions occurred when someone chose to explore – rather than cover up – an error. In 1928, before leaving to spend August with his family, Sir Alexander Fleming accidentally left a petri dish containing staphylococcus bacteria uncovered in his basement laboratory in London. When he returned a month later he found a fungus had contaminated the sample, killing off all the surrounding bacteria. Rather than toss the dish in the bin, he analysed the patch of mould, a strain of Penicillium notatum, and found it produced a powerful infection-fighting agent, which he named penicillin. Two decades later, the world’s first and still most widely used antibiotic hit the market, revolutionising healthcare and earning Fleming a Nobel Prize in Medicine. ‘Anyone who has never made a mistake,’ said Einstein, ‘has never tried anything new.’
Military folk have always known that owning up to mistakes is an essential part of learning and solving problems. Errors cost lives in the air force, so flight safety has usually taken precedence over fare bella figura. In the RAF’s long-running monthly magazine, Air Clues, pilots and engineers write columns about mistakes made and lessons learned. Crews are also fêted for solving problems. In a recent issue, a smiling corporal from air traffic control received a Flight Safety Award for overruling a pilot and aborting a flight after noticing a wingtip touch the ground during take-off.
In the RAF, as in most air forces around the world, fighter pilots conduct no-holds-barred debriefings after every sortie to examine what went right and wrong. But that never went far enough. RAF crews tended to share their mistakes only with mates rather than with their superiors or rival squadrons. As one senior officer says: ‘A lot of valuable experience that could have made flying safer for everyone was just seeping away through the cracks.’
To address this, the RAF hired Baines Simmons, a consulting firm with a track record in civil aviation, to devise a system to catch and learn from mistakes, just as the transportation, mining, food and drug safety industries have done.
Group Captain Simon Brailsford currently oversees the new regime. After joining the RAF as an 18-year-old, he went on to fly C-130 Hercules transport planes as a navigator in Bosnia, Kosovo, northern Iraq and Afghanistan. Now 46, he combines the spit-and-polish briskness of the officers’ mess with the easy charm of a man who spent three years as the Equerry to Her Majesty Queen Elizabeth II.
On the whiteboard in his office he uses a red felt-tip pen to sketch me a picture of a crashed jet, a dead pilot and a plume of smoke. ‘Aviation is a dangerous business,’ he says. ‘What we’re trying to do is stop picking up the deceased and the bits of the broken aeroplane on the ground and pull the whole story back to find out the errors and the near misses that can lead to the crash, so the crash never happens in the first place. We want to solve issues before they become problems.’
Every time crew members at RAF Coningsby catch themselves doing something that could jeopardise safety, they are now urged to submit a report online or fill in one of the special forms pinned up in work stations all over the base. Those reports are then funnelled to a central office, which decides whether to investigate further.
To make the system work, the RAF tries to create what it calls a ‘just culture’. When someone makes a mistake, the automatic response is not blame and punishment; it is to explore what went wrong in order to fix and learn from it. ‘People must feel that if they tell you something, they’re not going to get into trouble, otherwise they won’t tell you when things go wrong, and they might even try to cover them up,’ says Brailsford. ‘That doesn’t mean they won’t get told off or face administrative action or get sent for extra training, but it means they’ll be treated in a just manner befitting what happened to them, taking into account the full context. If you make a genuine mistake and put up your hand, we will say thank you. The key is making sure everyone understands that we’re after people sharing their errors rather than keeping it to themselves so that we’re saving them and their buddies from serious accidents.’
RAF Coningsby rams home that message at every turn. All around the base, in hallways, canteens and even above the urinals, posters urge crew to flag even the tiniest safety concern. Toilet cubicles are stuffed with laminated brochures explaining how to stay safe and why even the smallest mishap is worth reporting. Hammered into the ground beside the main entrance is a poster bearing a photo of the Station Flight Safety Officer pointing his finger in the classic Lord Kitchener pose. Printed above his office telephone number is the question: ‘So what did you think of today?’ The need to admit mistakes is also baked into cadets at military academy. ‘It’s definitely drilled into us from the start that “we prefer you mess up and let us know”,’ says one young engineer at RAF Coningsby. ‘Of course, you get a lot of stick and banter from your mates for making mistakes, but we all understand that owning up is the best way to solve problems now and in the future.’
The RAF ensures that crew see the fruits of their mea culpas. Safety investigators telephone all those who flag up problems within 24 hours, and later tell them how the case was concluded. They also conduct weekly workshops with engineers to explain the outcome of all investigations and why people were dealt with as they were. ‘You can see their eyebrows go up when it’s clear they won’t be punished for making a mistake and they might actually get a pat on the back,’ says one investigator.
Group Captain Stephanie Simpson, a 17-year veteran of the RAF, is in charge of safety in the engineering division at Coningsby. She has quick, watchful eyes and wears her hair scraped back in a tight bun. She tells me the new regime paid off recently when an engineer noticed that carrying out a routine test on a Typhoon had sheared off the end of a dowel in the canopy mechanism. A damaged canopy might not open, meaning a pilot trying to eject from the cockpit would be mashed against the glass.
The engineer filed a report and Simpson’s team swung into action. Within 24 hours they had figured out that an elementary mistake during the canopy test could damage the dowel, and that there was no requirement to go back and check the part afterwards, so the damage could easily go unnoticed. Flight crews immediately inspected the suspect part across the entire fleet of Typhoons in Europe and Saudi Arabia. The procedure was then changed to ensure that the dowel is no longer damaged during the test.
‘Ten years ago this would probably never have been reported – the engineers would have just thought, “Oh, that’s broken, we’ll just quietly replace it,” and then carried on,’ says Simpson. ‘Now we’re creating a culture where everyone is thinking, “Gosh, there could be other aircraft on this station with the same problem that might not be spotted in future so I’d better tell someone right now.” That way you stop a small problem becoming a big one.’
Thanks to Patounas’s candour, an RAF investigation discovered that a series of errors led to the near miss above the North Sea. His own failure to hear the order to bank left was the first. The second was that the other pilots changed course even though he did not acknowledge the fresh heading. Then, after Patounas overshot, the whole team failed to switch on their lights. ‘It turned out a whole set of factors were not followed and if anyone had done one of the things they should have, it wouldn’t have happened,’ says Patounas. ‘The upside is this reminds everyone of the rules for doing a Phase 3 VID at night. So next time we won’t have the same issue.’
Others in his squadron are already following his lead. Days before my visit, a young corporal pointed out that certain procedures were not being properly followed. ‘What she said was not a particularly good read, but that’s going in her report as a positive because she had the courage of her convictions to go against the grain when she could have been punished,’ says Patounas. ‘Twenty years ago, she wouldn’t have raised the question or if she had she’d have been told, “Don’t you say how rubbish my squadron is! I want my dirty laundry kept to me,” whereas I’m saying thank you.’
The RAF is not a paragon of problem-solving. Not every mistake or near miss is reported. Similar cases are not always dealt with in the same manner, which can undermine talk of a ‘just culture’. Some officers remain sceptical about persuading pilots and engineers to accept the virtues of airing all their dirty laundry. Many of the mea culpa columns in Air Clues magazine are still published anonymously. ‘Sorry’ remains a hard word to say in the RAF.
Yet the change is paying off. In the first three years of the new regime, 210 near misses or errors were reported at RAF Coningsby. Of these, 73 triggered an investigation. In each one, steps were taken to make sure the mistake never happened again. ‘Given that we never reported near misses before, that’s a quantum shift, a big leap of faith in people,’ says Brailsford. ‘Instead of putting a plaster over problems, we’re now going deeper and dealing with them at their root. We’re nipping problems in the bud by stopping them before they even happen.’ Other air forces, from Israel to Australia, have taken notice.
Adding the mea culpa to your problem-solving toolbox pays off beyond the military. Take ExxonMobil. After the epic Exxon Valdez oil spill off the coast of Alaska in 1989, the company set out to catch and investigate every screw-up, however small. It walked away from a large drilling project in the Gulf of Mexico because, unlike BP, it decided drilling there was too risky. Safety is now such a part of the corporate DNA that every buffet laid out for company events comes with signs warning not to consume the food after two hours. In its cafeterias, kitchen staff monitor the temperature of their salad dressings.
Every time an error occurs at an ExxonMobil facility, the first instinct of the company is to learn from it, rather than punish those involved. Staff talk about the ‘gift’ of the near miss. Glenn Murray, an employee for nearly three decades, was part of the Valdez clean-up. Today, as head of safety at the company, he believes no blunder is too small to matter. ‘Every near miss,’ he says, ‘has something to teach us if we just take the time to investigate it.’
Like the RAF and Toyota, ExxonMobil encourages even the most junior employee to speak up when something goes wrong. Not long ago a young engineer new to the company was uneasy about a drilling project in West Africa – so he temporarily closed it down. ‘He shut down a multi-million dollar project because he felt there were potential problems and we needed to pause and think it all through, and management backed him,’ says Murray. ‘We even had him stand up at an event and named him Employee of the Quarter.’ By every yardstick, Exxon now has an enviable safety record in the oil industry.
Mistakes can also be a gift when dealing with consumers. Four out of every five products launched perish within the first year, and the best companies learn from their flops. The Newton MessagePad, the Pippin and the Macintosh Portable all bombed for Apple yet helped pave the way for winners like the iPad.
Even in the cut-throat world of brand management, where the slightest misstep can send customers stampeding for the exit and hobble the mightiest firm, owning up to mistakes can deliver a competitive edge. In 2009, with sales tanking in the United States, Domino’s Pizza invited customers to deliver their verdict on its food. The feedback was stinging. ‘Worst excuse for a pizza I’ve ever tasted,’ said one member of the public. ‘Totally devoid of flavour,’ said another. Many customers compared the company’s pizza crust to cardboard.
Rather than sulk, or sit on the results, Domino’s issued a full-blown mea culpa. In documentary-style television commercials, Patrick Doyle, the company’s CEO, admitted the chain had lost its way in the kitchen and promised to deliver better pizzas in the future. Domino’s then went back to the drawing board, giving its pies a complete makeover with new dough, sauce and cheese.
Its Pizza Turnaround campaign worked a treat. Year-on-year sales surged 14.3 per cent, the biggest jump in the history of the fast-food industry. Two years after the apology the company’s stock price was up 233 per cent. Of course, the new pizza recipes helped, but the starting-point was Domino’s doing what RAF air crews and Exxon employees are now expected to do as a matter of course: acknowledging the error of its ways. This allowed the firm to learn exactly where it was going wrong so it could fix it. It also cleared the air. These days, so many companies trumpet ‘new and improved’ products that the net effect is a whirlwind of white noise that leaves consumers cold. The very act of owning up to its mistakes allowed Domino’s to cut through the din and reboot its relationship with customers.
PR experts agree that the best way for a company to handle a mistake is to apologise and explain what it will do to put things right. This accords with my own experience. The other day a payment into my bank account went astray. After 20 minutes of evasion from the call centre, my voice began to rise as my blood reached boiling point. And then a manager came on the line and said: ‘Mr Honoré, I’m very sorry. We made a mistake with this payment.’ As she explained how the money would be retrieved, my fury drained away and we ended up bantering about the weather and our summer holidays.
Public apologies can have a similarly soothing effect. When a customer filmed a FedEx driver tossing a package containing a computer monitor over a six-foot fence in the run-up to Christmas 2011, the video went viral and threatened to annihilate sales during the busiest time of year. Rather than stonewall, though, the company apologised right away. In a blog post entitled ‘Absolutely, Positively Unacceptable’, FedEx’s senior vice-president for US operations announced he was ‘upset, embarrassed, and very sorry’ for the episode. The company also gave the customer a new monitor and disciplined the driver. As a result, FedEx weathered the storm.
Even when we squander other people’s money, owning up in order to learn from the error is often the best policy. In 2011, Engineers Without Borders (EWB) Canada set up a website called AdmittingFailure.com, where aid workers can post their mistakes as cautionary tales. ‘Opening up like that is completely the opposite of the norm in the sector, so it was a huge risk,’ says Ashley Good, Venture Leader at EWB. But it paid off. No longer afraid of being pilloried for messing up, EWB staff became more willing to take the sort of risks that are often the stepping stone to creative breakthroughs. ‘People now feel they have the freedom to experiment, push themselves, take chances because they know they won’t be blamed if they don’t get it right on the first try,’ says Good. ‘And when you push boundaries like that, you get more creative solutions to problems.’ One example: after much trial and error, EWB has devised a system that improves water and sanitation services in Malawi by mobilising district governments, the private sector and communities all at the same time. Workers from across the development sector now post their own stories on AdmittingFailure.com. EWB’s donors love the new regime, too. Instead of dashing for the exit, they welcomed the eagerness to learn from mistakes. Says Good: ‘We’ve found that being open and honest actually builds a stronger bond and higher trust with our donors.’
The same holds true in personal relationships. A first step towards rebuilding bridges after falling out with a partner, friend, parent or child is for all parties to take their share of the blame. Admitting mistakes can ease the guilt and shame gnawing at the wrongdoer and help the victim overcome the anger that often stands in the way of forgiveness. Marianne Bertrand sees the magic of the mea culpa every week in her job as a family therapist in Paris. ‘Many people sit in my office and cannot even begin to address their problems because they are stuck in the rage and resentment for what went wrong,’ she says. ‘But when they finally accept and apologise sincerely for their mistakes, and hear the other person doing the same, you can really feel the atmosphere in the room change, the tension subside, and then we can start working on reconciliation.’
Even doctors are warming to the mea culpa. Study after study shows that what many patients want after being the victim of a medical mistake is not a lump sum payment or the physician’s head on a plate. What they really crave is what FedEx delivered in the wake of that package-tossing incident: a sincere apology, a full explanation of how the error occurred and a clear plan to ensure the same thing will not happen again. Among patients who file a suit for medical malpractice in the United States, nearly 40 per cent say they might not have done so had the attending physician explained and apologised for the mishap. The trouble is, many in the medical profession are too proud or too scared to say sorry.
Those that do so reap the benefits. In the late 1980s the Department of Veterans Affairs Medical Center in Lexington, Kentucky became the first hospital in the United States to tap the power of the mea culpa. It informs patients and their families when any member of staff makes a mistake that causes harm, even if the victims are unaware of the error. If the attending physician is found to be at fault, he or she must deliver a clear, compassionate apology to the patient. The hospital also explains the steps it will take to ensure that the error does not happen again, and may offer some form of restitution. But the cornerstone of the new regime is the simple act of saying sorry. This scores well with patients and their families. ‘We believe we spend much less time and money on malpractice lawsuits these days as a result,’ says Joseph Pellecchia, the hospital’s Chief of Staff.
Apologising also helps deliver better healthcare. When medical workers can deal openly with the emotional fallout that comes from making a mistake, they are less stressed and more able to learn from their errors. ‘Physicians are not gods, they are human beings, and that means they make mistakes,’ says Pellecchia. ‘There’s been an incredible change here where we’ve gone from a punitive environment to a learning environment where a physician can ask, “What happened here?” “What went wrong?” “Was it a systems problem?” “Was it me?” – and learn from their mistakes to deliver better care.’ Other hospitals around the world have followed suit. In the same vein, state and provincial governments across the US and Canada have enacted what are known as ‘sorry laws’, which bar litigants from using a physician’s apology as proof of guilt. Everywhere the net effect is the same: happier doctors, happier patients and less litigation.
The truth is that any Slow Fix worthy of the name usually starts with a mea culpa. Whether at work or in relationships, most of us tend to drift along pretending that all is well – remember the status-quo bias and the legacy problem. Admitting there is a problem, and accepting our share of the blame, can jolt us out of that rut. In the Twelve-Step Programme invented by Alcoholics Anonymous and now used in the battle against many other addictions, Step 1 is to admit you have lost control of your own behaviour. ‘Hello, my name is Carl, and I am addicted to the quick fix.’
To overcome our natural aversion to admitting mistakes, especially in the workplace, removing the stick of punishment is often just the first step. It also helps to dangle a carrot to encourage or even reward us for owning up. Remember the Employee of the Quarter accolade bestowed on that young engineer at ExxonMobil. As well as Flight Safety Awards, the RAF pays a cash bonus to anyone who highlights an error that later saves the Air Force money. In the aid world, organisations can win Brilliant Failure Awards for sharing mistakes made in development projects. At SurePayroll, an online payroll company, staff nominate themselves for a Best New Mistakes competition. At a light-hearted annual meeting, they listen to tales of colleagues messing up and what everyone can learn from their blunders. Those who own up to the most useful mistakes win a cash prize.
Even in education, where botching a single question on an exam paper can torpedo your chances of attending a top-tier university, moves are afoot to reward students for embracing mistakes. Worried that its high-achieving pupils had lost their appetite for taking intellectual risks, a top London girls’ school held a Failure Week in 2012. With the help of teachers and parents, and through assemblies, tutorials and other activities, students at Wimbledon High explored the benefits of being wrong. ‘Successful people learn from failure, pick themselves up and move on,’ says Heather Hanbury, the headmistress. ‘Something going wrong may even have been the best thing that could have happened to them in the long run – in sparking creativity, for instance – even if it felt like a disaster at the time.’ Failure Week has altered the atmosphere in the school. Instead of mollycoddling pupils, teachers feel more comfortable telling them point-blank when they have given a wrong answer, thus making it easier to search for a better one. The girls are taking greater risks, too, pursuing more daring lines of inquiry in the classroom and entering creative writing competitions in larger numbers. Members of the school debating club are deploying more adventurous arguments and winning more competitions. ‘Maybe the most important thing the Week gave us is a language to talk about failure as something not to avoid but as an essential part of learning, improving and solving problems,’ says Hanbury. ‘If one girl is upset by a poor mark, another might now make a friendly joke about it or say something like, “OK, you failed, but what can you learn from it?”’
Most workplaces are in dire need of a similar cultural shift. Think of all the lessons that go unlearned, all the problems left to fester, all the bad feelings churned up, all the time, energy and money wasted, thanks to the human instinct to cover up mistakes. Now think of how much more efficient – not to mention agreeable – your workplace would be if every error could be a spur to working smarter. Instead of muddling along, you could revolutionise your office or factory from the bottom up.
There are steps we can all take to harness the mea culpa and learn from our mistakes. Schedule a daily Clinton moment when you say, ‘I was wrong’ – and then find out why. When you mess up at work, pinpoint one or two lessons to be gleaned from the mishap and then quickly own up. When others mess up, quell the temptation to scoff or gloat and instead help them to spot the silver lining. Start a conversation in your company, school or family about how admitting mistakes can inspire creative leaps. Reinforce that message by using feel-good terms such as ‘gift’ or ‘bonus’ to describe the uncovering of helpful errors and by pinning up quotes such as this from Henry Ford: ‘Failure is simply the opportunity to begin again, this time more intelligently.’
It also helps to create a shared space, such as a web forum or a suggestions book, for airing mistakes. Borrowing an idea from Toyota, Patounas has put up a Communications Board in his squadron headquarters where any crew member can call attention to a problem – and every case is promptly investigated and addressed. ‘It’s very popular already and you see the engineers and pilots gathered round it,’ says Patounas. ‘It’s tangible and something you can put your arms round.’
It certainly helps to know that our errors seldom look as bad to others as we imagine. We have a natural tendency to overestimate how much people notice or care about our gaffes. Psychologists call this the ‘spotlight effect’. You may feel mortified to discover you attended a big meeting with laddered tights or egg on your tie, but the chances are hardly anyone else noticed. In one study at Cornell University, students were asked to walk into a room wearing a Barry Manilow T-shirt, a social kiss of death for any self-respecting hipster. While the subjects nearly died of embarrassment, only 23 per cent of the people in the room even clocked the cheesy crooner.
If owning up to a mistake is seldom as bad as we fear, however, it is only the first step towards a Slow Fix. The next is taking the time to work out exactly how and why we erred in the first place.
CHAPTER THREE
THINK HARD: Reculer Pour Mieux Sauter
Don’t just do something, stand there.
White Rabbit in Alice in Wonderland (Disney version)
If asked to design an office that could make staff look forward to Monday morning, you might come up with something like the headquarters of Norsafe. Every window looks onto a snapshot of bucolic bliss. Clapboard houses nestle in the forest, small boats bob alongside wooden piers, gulls float across a clear sky. In the late morning, the sunshine turns this narrow waterway in southern Norway into a strip of shimmering silver.
For many years the company’s balance sheet looked similarly idyllic. Norsafe has been building boats since 1903 in a country where boating is a serious business. With more coastline than the United States, this long, slender nation on the northern edge of Europe has always looked to the sea. Even today, one in seven Norwegians owns some sort of watercraft. But looks can be deceiving. Not so long ago Norsafe was a firm on the verge of a nervous breakdown, where nobody looked forward to coming in to work on Monday morning.
The company manufactures highly specialised lifeboats for oil rigs and supertankers. Enclosed like a submarine, and painted a vivid, regulation orange, they can drop into the sea, with a full load of passengers, from a height of nearly 40 metres. In the mid-2000s, as the global economy boomed, orders flooded in from around the world, tripling Norsafe’s turnover. Yet behind the top line numbers, the firm, like Toyota, had lost control of its inner workings and was struggling to keep up. Deadlines slipped, design faults passed unnoticed through the production plant, customer complaints went unanswered. With lawsuits piling up and profits plunging, the design, manufacturing and sales teams were at each other’s throats. Everyone knew there was a problem, but no one knew how to fix it.