Endure: Mind, Body and the Curiously Elastic Limits of Human Performance
If there was any remaining doubt about Hill’s vision of the “human machine,” the arrival of World War II in 1939 helped to erase it. As Allied soldiers, sailors, and airmen headed into battle around the world, scientists at Harvard and elsewhere studied the effects of heat, humidity, dehydration, starvation, altitude, and other stressors on their performance, and searched for practical ways of boosting endurance under these conditions. To assess subtle changes in physical capacity, researchers needed an objective measure of endurance—and Hill’s concept of VO2max fit the bill.
The most notorious of these wartime studies, at the University of Minnesota’s Laboratory of Physical Hygiene, involved thirty-six conscientious objectors—men who had refused on principle to serve in the armed forces but had volunteered instead for a grueling experiment. Led by Ancel Keys, the influential researcher who had developed the K-ration for soldiers and who went on to propose a link between dietary fat and heart disease, the Minnesota Starvation Study put the volunteers through six months of “semi-starvation,” eating on average 1,570 calories in two meals each day while working for 15 hours and walking 22 miles per week.
In previous VO2max studies, scientists had trusted that they could simply ask their subjects to run to exhaustion in order to produce maximal values. But with men who had been through the physical and psychological torment of months of starvation, “there is good reason for not trusting the subject’s willingness to push himself to the point at which a maximal oxygen intake is elicited,” Keys’s colleague Henry Longstreet Taylor drily noted. Taylor and two other scientists took on the task of developing a test protocol that “would eliminate both motivation and skill as limiting factors” in objectively assessing endurance. They settled on a treadmill test in which the grade got progressively steeper, with carefully controlled warm-up duration and room temperature. When subjects were tested and retested, even a year later, their results were remarkably stable: your VO2max was your VO2max, regardless of how you felt that day or whether you were giving your absolute best. Taylor’s description of this protocol, published in 1955, marked the real start of the VO2max era.
By the 1960s, growing faith in the scientific measurement of endurance led to a subtle reversal: instead of testing great athletes to learn about their physiology, scientists were using physiological testing to predict who could be a great athlete. South African researcher Cyril Wyndham argued that “men must have certain minimum physiological requirements if they are to reach, say, an Olympic final.” Rather than sending South African runners all the way across the world only to come up short, he suggested, they should first be tested in the lab so that “conclusions can be drawn on the question of whether the Republic’s top athletes have sufficient ‘horse-power’ to compete with the world’s best.”
In some ways, the man-as-machine view had now been pushed far beyond what Hill initially envisioned. “There is, of course, much more in athletics than sheer chemistry,” Hill had cheerfully acknowledged, noting the importance of “moral” factors—“those qualities of resolution and experience which enable one individual to ‘run himself out’ to a far greater degree of exhaustion than another.” But the urge to focus on the quantifiable at the expense of the seemingly abstract was understandably strong. Scientists gradually fine-tuned their models of endurance by incorporating other physiological traits like economy and “fractional utilization” along with VO2max—the equivalent of considering a car’s fuel economy and the size of its gas tank in addition to its raw horsepower.
It was in this context that Michael Joyner proposed his now-famous 1991 thought experiment on the fastest possible marathon. As a restless undergraduate in the late 1970s, Joyner had been on the verge of dropping out of the University of Arizona—at six-foot-five, and with physical endurance that eventually enabled him to run a 2:25 marathon, he figured he might make a pretty good firefighter—when he was outkicked at the end of a 10K race by a grad student from the school’s Exercise and Sport Science Laboratory. After the race, the student convinced Joyner to volunteer as a guinea pig in one of the lab’s ongoing experiments, a classic study that ended up demonstrating that lactate threshold, the fastest speed you can maintain without triggering a dramatic rise in blood lactate levels, is a remarkably accurate predictor of marathon time. The seed was planted, and Joyner was soon volunteering at the lab and embarking on the first stages of an unexpected new career trajectory that eventually led to a position as physician-researcher at the Mayo Clinic, where he is now one of the world’s most widely cited experts on the limits of human performance.
That first study on lactate threshold offered Joyner a glimpse of physiology’s predictive power. The fact that such an arcane lab test could pick the winner—or at least roughly predict the finishing order—among a group of endurance athletes was a tantalizing prospect. And when, a decade later, Joyner finally pushed this train of thought to its logical extreme, he arrived at a very specific number: 1:57:58. It was a ridiculous, laughable number—a provocation. Either the genetics needed to produce such a performance were exceedingly rare, he wrote in the paper’s conclusions, “or our level of knowledge about the determinants of human performance is inadequate.”
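The arithmetic behind a prediction like this can be sketched from the three traits mentioned above: sustainable oxygen uptake is VO2max multiplied by the fraction of it an athlete can hold at lactate threshold, and dividing by running economy (the oxygen cost of covering a kilometer) yields a sustainable velocity. The following is a rough illustration only—the input values are plausible elite figures chosen for demonstration, not Joyner’s exact published inputs:

```python
MARATHON_KM = 42.195

def marathon_time(vo2max, fractional_utilization, economy_ml_per_kg_km):
    """Estimate marathon time in minutes from three physiological traits.

    vo2max                 -- maximal oxygen uptake, ml O2 per kg per minute
    fractional_utilization -- share of VO2max sustainable at lactate threshold
    economy_ml_per_kg_km   -- oxygen cost of running, ml O2 per kg per km
    """
    sustained_vo2 = vo2max * fractional_utilization        # ml/kg/min
    velocity = sustained_vo2 / economy_ml_per_kg_km        # km/min
    return MARATHON_KM / velocity

# Illustrative elite values (assumptions for demonstration):
t = marathon_time(vo2max=84.0, fractional_utilization=0.85,
                  economy_ml_per_kg_km=200.0)
print(f"{int(t // 60)}h {t % 60:.1f}min")  # roughly two hours
```

With these inputs the model lands just under two hours—the point of the exercise being that a ceiling in this range follows mechanically once you treat the runner as a machine with known specifications.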
By Day 56, the relentless physical demands of Henry Worsley’s solo trans-Antarctic trek were taking a toll. He woke that morning feeling weaker than he’d felt at any point in the expedition, his strength sapped by a restless night repeatedly interrupted by a “bad stomach.” He set off as usual, but gave up after an hour and slept for the rest of the day. “You have to listen to your body sometimes,” he admitted in his audio diary.
Still, he was more than 200 miles from his destination and already behind his planned schedule. So he roused himself that night, packed up his tent, and set off again at ten minutes after midnight under the unblinking polar sun. He was approaching the high point of the journey, slogging up a massive ice ridge known as the Titan Dome, more than 10,000 feet above sea level. The thin air forced him to take frequent breaks to catch his breath, and a stretch of sandy, blowing snow bogged his sled down and slowed his progress for several hours. By 4 P.M., having covered 16 miles in 16 hours, he was once again utterly spent. He had hoped to cross from the 89th degree of southern latitude—the one closest to the South Pole—into the 88th, but he was forced to stop one mile short of his goal. “There was nothing left in the tank,” he reported. “I had completely run empty.”
The next day was January 9, the day that Shackleton had famously turned back from his South Pole quest in 1909. “A live donkey is better than a dead lion, isn’t it?” Shackleton had said to his wife when he returned to England. Worsley was camped just 34 miles from Shackleton’s turnaround latitude, and he marked the anniversary with a small cigar—which he chomped with a gap-toothed grin, having lost a front tooth to a frozen energy bar a few days earlier—and a dram of Dewar’s Royal Brackla Scotch whiskey, a bottle of which he had hauled across the continent.
Of the many advantages Worsley had over Shackleton, perhaps the most powerful was the Iridium satellite phone he carried in his pack, with which he could choose at any moment to call for an air evacuation. But this blessing was also a curse. In calculating his limits, Shackleton had been forced to leave a margin of error due to the impossibility of predicting how the return journey would go. Worsley’s access to near-instantaneous help, on the other hand, allowed him to push much closer to the margins—to empty his tank day after day, after struggling through the snow for 12, 14, or 16 hours; to ignore his increasing weakness and 50-pound weight loss; to fight on even as the odds tilted further against him.
Eventually, it became clear that he wouldn’t make it to his scheduled pickup. He’d been trying to log 16-hour days to get back on schedule, but soft snow and whiteouts combined with his continuing physical deterioration to derail him. He contemplated a shorter goal of reaching the Shackleton Glacier, but even that proved out of reach. On January 21, his seventieth day of travel, he made the call. “When my hero Ernest Shackleton stood 97 [nautical] miles from the South Pole on the morning of January 9, 1909, he said he’d shot his bolt,” Worsley reported in his audio diary. “Well today, I have to inform you with some sadness that I too have shot my bolt. My journey is at an end. I have run out of time, physical endurance, and the simple sheer ability to slide one ski in front of the other.”
The next day, he was picked up for the six-hour flight back to Union Glacier, where logistical support for Antarctic expeditions is based, and then airlifted to the hospital in Punta Arenas, Chile, to be treated for exhaustion and dehydration. It was a disappointing end to the expedition, but Worsley appeared to have successfully followed Shackleton’s advice to remain a “live donkey.” In the hospital, though, the situation took an unexpected turn: Worsley was diagnosed with bacterial peritonitis, an infection of the abdominal lining, and rushed into surgery. On January 24, at the age of fifty-five, Henry Worsley died of widespread organ failure, leaving behind a wife and two children.
When avalanches claim a skier, or sharks attack a surfer, or a puff of unexpected wind dooms a wingsuit flier, it’s always news. Like these other “extreme” deaths, Worsley’s tragic end was reported and discussed around the world. There was a difference, though. There had been no avalanche, no large, hungry predator, no high-speed impact. He didn’t freeze to death, he wasn’t lost, and he still had plenty of food to eat. Though it may never be clear exactly what pushed him over the edge, he seemed, in essence, to have voluntarily driven himself to oblivion—a rarity that added a grim fascination to his demise. “In exploring the outer limits of endurance,” Britain’s Guardian newspaper asked, “did Worsley not realize he’d surpassed his own?”
In a sense, Worsley’s death seemed a vindication of the mathematical view of human limits. “The machinery of the body is all of a chemical or physical kind. It will all be expressed some day in physical and chemical terms,” Hill had predicted in 1927. And every machine, no matter how great, has a maximum capacity. Worsley, in trying to cross Antarctica on his own, had embarked on a mission that exceeded his body’s capacity, and no amount of mental strength and tenacity could change that calculation.
But if that’s true, then why is death by endurance so rare? Why don’t Olympic marathoners and Channel swimmers and Appalachian Trail hikers keel over on a regular basis? That’s the riddle a young South African doctor named Tim Noakes posed to himself as he was preparing to deliver the most important talk of his life, a prestigious honorary lecture at the annual meeting of the American College of Sports Medicine, in 1996: “I said, now hold on. What is really interesting about exercise is not that people die of, say, heatstroke; or when people are climbing Everest, it’s not that one or two die,” he later recalled. “The fact is, the majority don’t die—and that is much more interesting.”
To catch the ferry, Diane Van Deren needed to cover 36 miles in just over 8 hours. That would normally be no problem for the veteran ultra-runner—except, in this case, for the unforgiving terrain, the torrential rain and sumo-force winds left in the wake of Tropical Storm Beryl, and the fatigue and horrendous blisters accrued over the first 19 days and 900 miles of the Mountains-to-Sea Trail across North Carolina. Worse, Van Deren was startled to hear a “savage and malicious” roar from the darkness to her right. “What is that?” she yelled to her trail guide, Chuck Millsaps, the owner of a local outfitting company. It was just an airplane, he assured her—but to be safe, they strapped themselves together as they prepared to cross a wind-whipped bridge.
At stake in all the chaos was Van Deren’s attempt to set a new record for the 1,000-mile trail: if they missed the 1 P.M. ferry from Cedar Island to Ocracoke, the mark of 24 days, 3 hours, and 50 minutes would be out of reach. The fifty-two-year-old Coloradan was a connoisseur of the slow-drip torture of ultra-endurance challenges. She had pulled a 45-pound sled 430 miles across the frozen tundra to win the Yukon Arctic Ultra (second place was—well, no other woman finished); scaled the 22,838-foot peak of Aconcagua as part of a Mayo Clinic research expedition studying human limits; and racked up top finishes at grueling races of 100 miles or more around the world. Making the ferry, though, would require squeezing a relative sprint from her battered legs. She had been running from dawn to near-dawn for almost three weeks, sleeping one to three hours a night, barely pausing to let her North Face–supported crew team duct-tape her blistered feet and cram food into her mouth.
Fortunately, Van Deren had an advantage—or at least, a unique quirk that seemed to help her push past the corporeal limits that drag down most would-be ultramarathoners. At thirty-seven, she had undergone elective brain surgery to remove a golf-ball-sized chunk of her temporal cortex, the focal point of epileptic seizures that had plagued her, as often as two or three times a week, for years. The surgery successfully stopped the seizures but also left her with neurological deficits: poor memory, an impaired sense of direction, difficulty keeping track of time. A 2011 Runner’s World profile dubbed her “The Disoriented Express,” noting that “in races she must cover hundreds of miles, and yet often has no idea how long she has been running.” A significant handicap, you’d think—and yet it was only after the surgery that her racing career even started. To understand her extraordinary endurance, in other words, start with her brain.
The brain’s role in endurance is, perhaps, the single most controversial topic in sports science. It’s not that anyone thinks the brain doesn’t matter. Everyone, right back to A. V. Hill and other pioneers of the “body as machine” view, has always understood that the race is not always to the swift—particularly if the swift make bad tactical decisions, pace themselves poorly, or simply are unwilling to suffer. In that view, the body sets the limits, and the brain dictates how close you get to those boundaries. But starting in the late 1990s, a South African physician and scientist named Tim Noakes began to argue that this picture is insufficiently radical—that it’s actually the brain alone that sets and enforces the seemingly physical limits we encounter during prolonged exercise. The claim has profound and surprising implications, and the extent to which it’s true or false remains one of the most volatile flashpoints in exercise physiology, two decades later.
The particular tone of the controversy has as much to do with Noakes himself—an instinctive iconoclast who has been clashing with his scientific peers more or less continuously for four decades now—as with his ideas. “Tim is probably his own worst enemy,” says Carl Foster, the director of the University of Wisconsin–La Crosse’s Human Performance Laboratory and a former president of the American College of Sports Medicine, who counts Noakes as a friend. “He’s a very strong personality, and he gets these really neat, innovative ideas, but instead of saying, ‘Wow, I’ve found a better way to explain this,’ he says, ‘Everybody else is wrong.’” (Noakes, for his part, denies ever saying that everyone else is wrong. “Of course I believe they are wrong, but I am not about to tell them that,” he helpfully clarified in an email. “I just present what I believe is the truth.”) Either way, Foster acknowledges, if you want to challenge a century’s worth of textbook material, “maybe that stirring the pot is necessary.”
Noakes started out as a collegiate rower at the University of Cape Town, but his trajectory was altered one morning in the early 1970s when his rowing practice was canceled due to high winds. His teammates went home, but Noakes decided to stay and run around a nearby lake. After forty minutes, he was overcome by a feeling of euphoria—the classic but elusive runner’s high. Thanks in part to this quirk of brain chemistry, he quickly became hooked on the new sport, and ultimately shifted his professional interests from clinical medicine to running-related research. He went on to complete more than seventy marathon and ultra-marathon races, including seven finishes at South Africa’s famous 56-mile Comrades Marathon.
In the lab, meanwhile, his penchant for “paradigm-rattling,” as Foster calls it, emerged early. At a landmark gathering of sports scientists before the 1976 New York Marathon, at the height of the first jogging boom, most of the presentations focused on the incredible health benefits of running. Noakes, in contrast, presented the case report of an experienced marathoner who’d suffered a heart attack, puncturing the then-popular notion that marathoners were immune to clogged arteries. In 1981, he reported the case of Eleanor Sadler, a forty-six-year-old woman who collapsed during the Comrades Marathon, and diagnosed her problem as hyponatremia, a result of drinking too much, rather than the more common problem of drinking too little. It took another two decades—and a handful of deaths—before the scientific community fully acknowledged the dangers of overdrinking during exercise.
That same year, Noakes cofounded a dedicated sports science unit in the basement of the University of Cape Town’s physiology department, with a single stationary bicycle and a nearly obsolete treadmill. He and his colleagues began bringing athletes in and testing their maximal oxygen consumption—“because,” he says, “in 1981, to be a sports scientist, you had to have a VO2max machine, to measure VO2max.” But it didn’t take long for Noakes to grow dissatisfied with the insights provided by A. V. Hill’s signature measurement. One day in the lab’s early years, he tested track star Ricky Robinson and Comrades champion Isavel Roche-Kelly, less than an hour apart—and despite their vastly different racing speeds, they both recorded the same VO2max. Noakes’s conclusion: “Clearly the VO2max was totally useless, because here we had a sub-four-minute miler and it couldn’t say he was any better than the lady who could run a five-minute mile.”
Over the next decade, Noakes began searching for better ways of predicting and measuring endurance, and other ways of explaining the apparent limits runners like Robinson and Roche-Kelly encountered when they finally had to step off the treadmill at the end of a test to exhaustion. Hill and his successors had focused on oxygen: at your limits, your heart was incapable of pumping any more oxygen to your muscles, or your muscles were incapable of extracting any more oxygen from your bloodstream. Noakes’s first idea for an alternative to VO2max, in the late 1980s, was that the limits might reside in the contractility of the muscle fibers themselves, but that theory fizzled.
By the 1990s, Noakes had become an internationally renowned running guru, thanks to the enduring pop-sci classic Lore of Running, a 944-page doorstopper that first appeared in 1985. In 1996, he received one of the highest honors in the field of exercise physiology: an invitation to deliver the J. B. Wolffe Memorial Lecture at the annual meeting of the American College of Sports Medicine. True to his reputation, he decided to harangue his eminent audience about their stubborn adherence to the “ugly and creaking edifices” of old theories that were unsupported by “empirical science.” It was in preparing for this talk that he had his crucial epiphany about the rarity of deaths from exhaustion, like Henry Worsley’s. Whatever our limits are, something must prevent us from exceeding them by too much. And that something, he reasoned, must be the brain.
The history of brain research is, in some ways, a tale of unfortunate injuries and illnesses. Phineas Gage, for example, was a twenty-five-year-old construction foreman working on a new railway route in 1848 when a misfired explosive blast sent a 43-inch-long tamping iron rocketing up through his cheek and out the top of his skull. His survival was remarkable enough, but even more surprising were the alterations in his personality. A polite, competent man was suddenly transformed, through damage to his frontal lobes, to a profane and unreliable one: to his friends, the doctor who treated him reported, Gage was “no longer Gage.” Since then, we’ve learned a great deal about how the brain works by observing the distinctive changes that follow damage to different parts of the brain—an assortment of strange and mostly sad transformations of the type chronicled with tenderness and humanity by the late neurologist Oliver Sacks.