The Forgetting: Understanding Alzheimer’s: A Biography of a Disease


All of a sudden, everyone seemed to know someone touched by Alzheimer’s. Partly, this was due to a shift in public conception of senile dementia. Only in the mid-1970s had doctors started to realize that senility is not an inevitable process of brain aging and decay but a recognizable—and perhaps one day treatable—disorder. Gradually, this perception also started to seep into the general consciousness: Senility is a disease.

Since then, there had been a staggering rise in actual cases of Alzheimer’s, corresponding to a vast increase in the elderly population. People were now living much longer lives. Longer lives meant more cases of Alzheimer’s. Since 1975, the estimated number of Alzheimer’s cases in the U.S. had grown tenfold, from 500,000 to nearly 5 million. Worldwide, the total was probably about three times that figure. In the absence of a medical breakthrough, the gloomy trend would not only continue, but would also get much, much worse.

The Roman poet Virgil wrote in the first century B.C., “Time wastes all things, the mind, too.” He was partly right. Scientists do not believe that Alzheimer’s is an inevitable consequence of aging. Many people will never get the disease regardless of how long they live. But aging is by far the greatest risk factor. It is almost unheard of in people aged 20–39, and very uncommon (about one in 2,500) for people aged 40–59. For people in their sixties, the odds begin to get more worrisome. An estimated:

• 1 percent of 65-year-olds

• 2 percent of 68-year-olds

• 3 percent of 70-year-olds

• 6 percent of 73-year-olds

• 9 percent of 75-year-olds

• 13 percent of 77-year-olds

have Alzheimer’s or a closely related dementia. The risk accelerates with age, to the point where dementia affects nearly half of those eighty-five and over.

So, as the twentieth century came to a close, a shadow legacy was rapidly becoming apparent—the dark, unintended consequence of the century’s great advances in hygiene, nutrition, and medicine. Life spans in industrialized nations had nearly doubled over the previous one hundred years, and the percentage of elderly among the general population had more than tripled. In the process, the number of cases of senile dementia mushroomed. A hundred years before, it had not even been a statistical blip. Paradoxically, in the full blush of medical progress of the twentieth century, it had blossomed into a major public health problem.

Most strikingly to social workers like Judy and Irving, the number of people who had Alzheimer’s and who knew they had Alzheimer’s had exploded. A huge portion of the newly diagnosed cases were in the very early stages of the disease. “This is something new in the field,” Irving explained. “Most people never before realized that there is an early stage of Alzheimer’s. I had worked with the more advanced stages, but when I came into this it was overwhelming for me. It’s very hard to get used to a normal person who happens to have dementia. It’s a whole different ballgame.”

Judy and Irving recognized, along with many others in the national Alzheimer’s community, that something had to be done to help this emerging new constituency: early-stage dementia sufferers still functioning well enough to fully understand what lay ahead. With the assistance of the Alzheimer’s Association, they formed a support group at Freund House. “Our goal,” explained Irving, “is to try to help these people live a quality life, to help them gain some coping mechanisms for their deficits, and to help them feel better as human beings.” While scientists did battle with this disease, victims and their families had the opposite task: to make a certain peace with it, to struggle to understand the loss, come to terms with it, create meaning out of it.

Alzheimer’s is what doctors call a disease of “insidious onset,” by which they mean that it has no definitive starting point. The plaques and tangles proliferate so slowly—over decades, perhaps—and silently that their damage can be nearly impossible to detect until they have made considerable progress. Part of the function of any early-stage support group must be to try to make sense of this strange new terrain that lies between healthy and demented. Where, in specific behavioral terms, is the person overshadowed by the disease?

Individually and collectively, the Freund House group was trying to find out, and to make sense of the answer. “My wife gets frustrated with me,” Arnie related to his fellow group members, “and she is right to be frustrated. She asks me to put a can in the recycling … and I don’t do it. She says, ‘I know this is because of your illness, that this is not you.’”

Sadie nodded her head in recognition. “My mother had this, too,” she said. “Now I know what it was like for my father to take care of her. We used to get so mad at him when he would be short with her.”

Coping with a particular disability was one thing; trying to cope with an ever-shifting invisible illness, though, was a challenge unique to Alzheimer’s disease. In this early period, the insidiousness itself was often the most troubling thing about the disease—arguably even a disease unto itself. As a group, these new patients could gain a more confident understanding of their disease, and tackle issues that would seem impossibly difficult to one isolated, failing person.

Driving, for instance. The first big question they confronted right after forming the group was: Should they continue, in this blurry period of semi-normalcy, to pilot massive steel boxes at thirty and forty and fifty miles per hour down roads lined with bicycles and toddlers? Studies showed conclusively that Alzheimer’s is, overall, a major driving hazard. Bystanders had been killed by Alzheimer’s patients who suffered a lapse in judgment or were momentarily overcome by confusion. But the law had not yet caught up with this reality. Even with a diagnosis, no doctor or judge had ever confiscated a license. Families were forced to decide on their own when driving was no longer appropriate.

Together, after much deliberation, the group decided that it had already become too dangerous. Collectively, they gave up this highly charged symbol of autonomy and competence. On this shaky new terrain, a person’s independence could no longer be taken for granted.

In the summer of 1984, at the age of eighty-five, E. B. White, the tender essayist and author of Charlotte’s Web, became waylaid by some form of dementia. It came on very swiftly. In August, he began to complain of some mild disorientation. “We didn’t pay much attention,” recalls his stepson, Roger Angell, “because he was a world-class hypochondriac.” But just a few weeks later, White was severely confused much of the time. By the following May, he was bedridden with full-on dementia, running in and out of vivid hallucinations and telling visitors, “So many dreams—it’s hard to pick out the right one.” He died just a few months after that, in October 1985.

An obituary in the New York Times reported White as having Alzheimer’s disease, but that appeared to miss the mark. In fact, he was never even informally diagnosed with the disease, and his symptoms strongly suggested another illness. The rapid onset of the confusion and the abrupt shift from one stage to the next were classic signs of multi-infarct dementia, the second-most common cause (15 percent) of senile dementia after Alzheimer’s (60 percent). Multi-infarct dementia is caused by a series of tiny strokes. Its victims can have much in common with those of Alzheimer’s, but the experience is not as much of an enigma. Its cause is known, somewhat treatable, and, to a certain extent, preventable (diet, exercise, and medication can have an enormous impact on risk of strokes). Its jerky, stepwise approach is easier to follow and understand as symptoms worsen.

Alzheimer’s disease is not abrupt. It sets in so gradually that its beginning is imperceptible. Creeping diseases blur the boundaries in such a way that they can undermine our basic assumptions of illness. Alzheimer’s drifts from one stage to the next in a slow-motion haze. The disease is so gradual in its progression that it has come to be formally defined by that insidiousness. This is one of the disease’s primary clinical features, one key way that Alzheimer’s can be distinguished from other types of dementia: those caused by strokes, brain tumor, underactive thyroid, and vitamin deficiency or imbalance in electrolytes, glucose, or calcium (all treatable and potentially reversible conditions).

It is also nearly impossible to officially diagnose. A definitive determination requires evidence of both plaques and tangles—which cannot be obtained without drilling into the patient’s skull, snipping a tiny piece of brain tissue, and examining it under a microscope. Brain biopsies are today considered far too invasive for a patient who does not face imminent danger. Thus—Kafka would have enjoyed this—as a general rule, Alzheimer’s sufferers must die before they can be definitively diagnosed. Until autopsy, the formal diagnosis can only be “probable Alzheimer’s.”

These days, a decent neuropsychologist can maneuver within this paradox—can make a diagnosis of probable Alzheimer’s with a confidence of about 90 percent—through a battery of tests. The process almost always begins with this simple quiz:

What is today’s date?

What day of the week is it?

What is the season?

What country are we in?

What city?

What neighborhood?

What building are we in?

What floor are we on?

I’m going to name three objects and I want you to repeat them back to me: street, banana, hammer.

I’d like you to count backwards from one hundred by seven. [Stop after five answers.]

Can you repeat back to me the three objects I mentioned a moment ago?

[Points at any object in the room.] What do we call this?

[Points at another object.] What do we call this?

Repeat after me: “No ifs, ands, or buts.”

Take this piece of paper in your right hand, fold it in half, and put it on the floor.

[Without speaking, doctor shows the patient a piece of paper with “CLOSE YOUR EYES” printed on it.]

Please write a sentence for me. It can say anything at all, but make it a complete sentence.

Here is a diagram of two intersecting pentagons. Please copy this drawing onto a plain piece of paper.

This neurological obstacle course is called the Mini Mental State Examination (MMSE). Introduced in 1975, it has been a part of the standard diagnostic repertoire ever since. The MMSE is crude but generally very effective in detecting problems with time and place orientation, object registration, abstract thinking, recall, verbal and written cognition, and constructional praxis. A person with normal functioning will score very close to the perfect thirty points (I scored twenty-nine, getting the date wrong). A person with early-to-moderate dementia will generally fall below twenty-four.
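For readers who like to see the numbers made concrete, the scoring rule described above can be sketched as a tiny function. This is purely an illustrative sketch, not a clinical tool: the function names are invented here, and the single cutoff at twenty-four is a simplification of the figures in the text (real interpretation adjusts for age and education).

```python
def interpret_mmse(score: int) -> str:
    """Rough interpretation of a Mini Mental State Examination total.

    The MMSE is scored out of a perfect 30 points. Per the text, normal
    functioning scores very close to 30, while early-to-moderate dementia
    generally falls below 24. The two band labels are illustrative only.
    """
    if not 0 <= score <= 30:
        raise ValueError("MMSE totals range from 0 to 30")
    if score >= 24:
        return "within the normal range"
    return "suggestive of early-to-moderate dementia"


def serial_sevens(start: int = 100, steps: int = 5) -> list[int]:
    """Expected answers for the 'count backwards from one hundred by
    seven' item, stopping after five answers: 93, 86, 79, 72, 65."""
    return [start - 7 * (i + 1) for i in range(steps)]
```

On this rule, the author’s score of twenty-nine falls comfortably in the normal band, while a score of twenty would flag early-to-moderate dementia.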

The very earliest symptoms in Alzheimer’s are short-term memory loss—the profound forgetting of incidents or conversations from just a few hours or the day before; fleeting spatial disorientation; trouble with words and arithmetic; and some impairment of judgment. Later on, in the middle stages of the disease, more severe memory problems are just a part of a full suite of cognitive losses. Following that, the late stages feature further cognitive loss and a series of progressive physical disabilities, ending in death.

One brilliantly simple exam, the Clock Test, can help foretell all of this and can enable a doctor to pinpoint incipient dementia in nine out of ten cases. In the Clock Test, the doctor instructs the patient to draw a clock on a piece of paper and then draw hands to a certain time. Neurologists have discovered that patients in the early stages of dementia tend to omit or misplace numbers on the clock far more often than cognitively healthy people do. They are not entirely sure why this is, but the accuracy of the test speaks for itself.

A battery of other performance tests can help highlight and clarify neurological deficiencies. The Buschke Selective Reminding Test measures the subject’s short-term verbal memory. The Wisconsin Card Sorting Test gauges the ability to deduce sorting patterns. In the Trail Making Test, psychomotor skills are measured by timing a subject’s attempt to draw a line connecting consecutively numbered circles. Porteus Mazes measure planning and abstract-puzzle-solving ability.

If the patient performs poorly in a consistent fashion, the next step will likely involve elaborate instruments. Conveniently for physicians, Alzheimer’s disease always begins in the same place: a curved, two-inch-long, peapod-like structure in the brain’s temporal lobes called the hippocampus (the temporal lobes are located on either side of the head, inward from the ear). Doctors can get a good look at the hippocampus with a magnetic resonance imaging (MRI) scanner, which bombards the body with radio waves and measures the reflections off tissue. A simple volume measurement of the hippocampus will often show, even in the very early stages of Alzheimer’s, a pronounced decrease in volume, particularly in contrast with other brain structures. By itself, the MRI cannot diagnose Alzheimer’s. But it can add one more helpful piece to the diagnostic puzzle.

Other advanced measurements might also help: A positron emission tomography (PET) scan may detect a decrease in oxygen flow or glucose metabolism in the same area. A single photon emission computed tomography (SPECT) scan may catch decreases in blood flow. A moderate to severe amount of slowing in the alpha rhythm in an electroencephalogram (EEG) is often characteristic of dementia. But such measurements are generally not required for a tentative diagnosis. In the face of convincing results from memory and performance tests, and in the absence of any contravening evidence—disturbance in consciousness, extremely rapid onset of symptoms, preponderance of tremors or other muscular symptoms, difficulties with eye movements or reports of temporary blindness, seizures, depression, psychosis, head trauma, a history of alcoholism or drug abuse, any indication of diabetes, syphilis, or AIDS—a diagnosis of probable Alzheimer’s is rendered.

Alzheimer’s disease. The diagnosis is a side-impact collision of overwhelming force. It seems unreal and unjust. After coming up for air, the sufferer might ask, silently or out loud, “What have I done to deserve this?” The answer is, simply, nothing. “I remember walking out of the clinic and into a fresh San Diego night feeling like a very helpless and broken man,” recalled Bill, a fifty-four-year-old magazine editor, to writer Lisa Snyder. “I wondered if there was anything for me to live for.”

It can take a while to sink in. Experienced doctors know not to try to convey any other important information to a patient or family member on the same day that they disclose the diagnosis. They put some helpful information into a letter, and schedule a follow-up.

There is no cure for Alzheimer’s at the present time, and not much in the way of treatment. Historically, the one saving grace of the disease has been that many, if not most, of the people who acquire it do not comprehend what is about to happen to them and their families. Now, for better or worse, that has changed. More and more are learning at the earliest possible opportunity what they have, and what it means.

What will they do with the advance knowledge? It is not an easy question. Will they use the time left to get their affairs in order and to prepare themselves emotionally for the long fade? Or will the knowledge only add to the frustration and force them into a psychological spiral to accompany the physiological one?

The Freund House early-stage support group was one experimental approach to tackling such unknowns. When Judy and Irving created it in 1997, they weren’t sure it would work. Could people struggling with memory loss, spatial disorientation, and confusion actually strike up a meaningful relationship with a group of strangers? They had to assemble just the right team. “We had to turn many people away,” said Judy, “because we didn’t feel they were right for a support group. They weren’t introspective enough. They weren’t bothered enough.”

The group was also temporary by design. As participants lost the ability to contribute, they would be eased out of the group, and perhaps admitted to a middle-stage group like the one that Judy ran down the hall. In that group, volunteer caregivers always accompanied patients to the restroom and back, because otherwise they would get lost. Most, not all, still responded to their own name. After a cafeteria-style lunch, everyone came together in a circle to sing fun songs together, like the theme from Barney:

I love you

You love me

We’re a happy family

Members of the early-stage group occasionally caught a glimpse of the middle-stage group as they passed by to get a cup of coffee. The quiet, desperate hope of everyone in this group was not to end up in the other group. Barring a scientific miracle, though, there would be no avoiding it. The average interval from diagnosis to death in Alzheimer’s disease is eight years.

In the meantime, there were a hundred small consolations. The early-stage group members had quickly come to rely on one another for help through this very strange ordeal. Sometimes barely able to remember from week to week, they had nevertheless become friends. They shared memories of movie stars and kosher butchers. They talked about travel and passed around pictures of grandchildren. They even talked politics.

“Greta, any comments on Giuliani?” Judy asked one afternoon.

Greta swatted an invisible bug away from her face. “Oh don’t get me started about him,” she said. “You know I can’t stand him.”

“Clinton, then? What does everyone think about Monica?”

Opinions ran the gamut. Ted, his hands shaking with a Parkinsonian tremor (it is not unusual for people to suffer from both Parkinson’s and Alzheimer’s), suggested that Clinton should resign because he lied directly to the American people. Greta, a lifelong subscriber to The Nation, thought that Clinton probably kissed Monica but that the whole issue was overblown. Sadie thought it was all a Republican scheme.

Doris had an opinion, too, but with her severe expressive aphasia—an inability to retrieve words—she had great difficulty making it known.

“Gore … President … I think … good leader … lies …”

She appeared to be aware of her thoughts and very clear on what she wanted to say. But the words were no longer accessible. This was especially painful to watch because, as everyone in the group knew by now, Doris had a forty-year-old son with cerebral palsy who was deaf. The two were very close, and, as it happened, she was the only one in the family to have ever learned sign language. Now Doris’s aphasia was also wiping away that second and more vital language. She could no longer speak to her son, leaving him marooned.

It was now a few minutes after one o’clock, time to say good-bye for the week. Rides were arranged. Someone went to fetch William’s wife, a volunteer in the middle-stage group.

Robert seemed to be having a hard time of it. Just a moment before, he had been lucidly telling me about his family and his past. He’d had no problem relating how he was spirited out of Nazi Germany as a young boy, turned over to relatives in England and later in New York. I learned all about his children, their occupations and families, the cities they lived in. But now he was struggling to understand a piece of paper his wife had written out for him about getting home. To the undamaged brain, the instructions were fairly straightforward—Robert will be picked up by the car service at 1:15, and should be driven to his home at ___ Street.…—but he was having a lot of trouble making sense of it. Then there was the other problem. In the last half hour, he had told me how he eventually came to live in the Bronx, where he was introduced to his wife, a distant cousin. He had described how crowded that Bronx apartment was, and where else he had lived in the city as he’d grown older. But now, for the life of him, Robert could not remember where he had put his jacket.

It was on the back of his chair.

Very often I wander around looking for something which I know is very pertinent, but then after a while I forget about what it is I was looking for.… Once the idea is lost, everything is lost and I have nothing to do but wander around trying to figure out what it was that was so important earlier. You have to learn to be satisfied with what comes to you.

—C.S.H.

Harrisonburg, Virginia

Chapter 3 THE GOD WHO FORGOT AND THE MAN WHO COULD NOT

There could be no happiness, cheerfulness, hope, pride, immediacy, without forgetfulness. The person in whom this apparatus of suppression is damaged, so that it stops working, can be compared … to a dyspeptic; he cannot “cope” with anything.

—FRIEDRICH NIETZSCHE

As recorded in the Pyramid Texts of 2800 B.C., Ra was the Sun God, the creator of the universe and of all other gods. From his own saliva came air and moisture. From his tears came humankind and the river Nile. He was all-powerful and, of course, immortal—but still not immune to the ravages of time: Ra, the supreme God, became old and senile. He began to lose his wits, and became easy prey for usurpers.

Throughout recorded history, human beings have been celebrating the powers of memory and lamenting its frailties. “Worse than any loss in body,” wrote the Roman poet Juvenal in the first century A.D., “is the failing mind which forgets the names of slaves, and cannot recognize the face of the old friend who dined with him last night, nor those of the children whom he has begotten and brought up.”

It took several thousand years, though, for anyone to figure out how memory actually worked. Plato was among the first to suggest a mechanism. His notion was of a literal impression made upon the mind. “Let us suppose,” he wrote, “that every man has in his mind a block of wax of various qualities, the gift of Memory, the mother of the Muses; and on this he receives the seal or stamp of those sensations and perceptions which he wishes to remember. That which he succeeds in stamping is remembered and known by him as long as the impression lasts; but that, of which the impression is rubbed out or imperfectly made, is forgotten, and not known.”

Later came the ventricular theory of cognition, from Galen (129 – ca. 199 A.D.), Nemesius (fourth century), and St. Augustine (354–430). According to this notion, the three major functions of the brain—sensation, movement, and memory—were governed from three large, round fluid-filled sacs. Vital Spirit, a mysterious substance that also contained the human soul, was harbor to the swirl of memories.

From this model came cerebral localization, the theory that the various functions of the brain were each controlled by specialized “modules.” This model of specialization turned out to be generally correct (if radically different in the details from what Galen had imagined). In the early twentieth century, it emerged that the brain wasn’t really an organ so much as a collection of organs, dozens of structures interacting with one another in dazzling complexity. Deep in the center of the brain the amygdala regulates fear while the pituitary coordinates adrenaline and other hormones. Visual stimulus is processed in the occipital lobe, toward the rear of the skull. Perception of texture is mediated by Area One of the parietal lobe near the top of the head, while, just to the rear, the adjacent Area Two differentiates between the size and shape of objects and the position of joints. The prefrontal cortex, snuggled just behind the forehead, spurs self-determination. Broca’s area, near the eyes, enables speech. Wernicke’s area, above the ears, facilitates the understanding of speech.

The more researchers discovered about localization, though, the more they wondered about the specialized zone for memory. Where was it? If vision was in the back of the brain, texture on top, and so on, what region or regions controlled the formation of lasting impressions and the retrieval of those impressions?

Part of the answer came in 1953, when a Harvard-trained neurosurgeon named William Beecher Scoville performed experimental surgery on a twenty-seven-year-old patient known as H.M. He had been suffering from violent epileptic seizures since childhood, and in a last-ditch effort to give him a chance at a normal life, Scoville removed a small collection of structures, including the hippocampus, from the interior portion of his brain’s two temporal lobes. The surgery was a great success in that it significantly reduced the severity of H.M.’s epilepsy. But it was also a catastrophe in that it eliminated his ability to lay down new memories. The case revolutionized the study of memory, revealing that the hippocampus is essential in consolidating immediate thoughts and impressions into longer-lasting memories (which are in turn stored elsewhere).

Time stopped for H.M. in 1953. For the rest of his long life, he was never again able to learn a new name or face, or to remember a single new fact or thought. Many doctors, researchers, and caregivers got to know him quite well in the years that followed, but they were still forced to introduce themselves to him every time they entered his room. As far as H.M. was concerned, he was always a twenty-five-year-old man who was consulting a doctor about his epilepsy (he had also lost all memory of the two years immediately prior to the surgery). H.M. became perhaps the most important neurological subject in history and was subject to a vast number of studies, but he remembered none of the experiments once they were out of his immediate concentration. He was always in the Now.

In the clinical lexicon, this was a perfect case of anterograde amnesia, the inability to store any new memories. Persons with incipient Alzheimer’s disease exhibit a slightly less severe form of the same problem. The memory of leaving the car keys in the bathroom isn’t so much lost as it was never actually formed.

In a healthy brain, sensory input is converted into memory in three basic stages. Before the input even reaches consciousness, it is held for a fraction of a second in an immediate storage system called a sensory buffer.

Moments later, as the perception is given conscious attention, it passes into another very temporary system called short-term (working) memory. Information can survive there for seconds or minutes before dissolving away.

Some of the information stirring in working memory is captured by the mechanism that very slowly converts into a long-term memory lasting years and even a lifetime.

Long-term memories can be either episodic or semantic. Episodic memories are very personal memories of firsthand events remembered in order of occurrence. Before the baseball game the other day, I put on my new pair of sneakers, which I had gotten earlier that morning. Then we drove to the stadium. Then we parked. Then we gave the man our tickets. Then we bought some hot dogs. Then we went to our seats …

Now, days later, if I notice a mustard stain on my shoe, I can plumb my episodic memory to determine when and how it happened. If my feet start bothering me, my episodic memory will help me figure out whether it happened before or after I bought my new shoes.

Semantic memories are what we know, as opposed to what we remember doing. They are our facts about the world, stored in relation to each other and not when we learned them. The memory of Neville Chamberlain’s “peace in our time” is semantic.

They are separate systems—interrelated, but separate. An early-stage Alzheimer’s patient who cannot retain memories of where she put her keys has not forgotten what keys are for, or what kind of car she drives. That will come much, much later, when she starts to lose old semantic memories.

The experience with H.M. taught researchers that the hippocampus is key to long-term memory formation. Without that tiny organ, he was totally incapable of forming new, lasting memories. Alzheimer’s patients suffer the exact same systemic loss, but over several years rather than one surgical afternoon. For H.M., there were no new memories after 1953, period. In later years, he was unable to recognize his own face in the mirror. Real time had marched on, 1955 … 1962 … 1974, but as far as he was concerned, he was still twenty-five years old. If you are a young man, alert and intelligent, and you look into an ordinary mirror only to discover the face of a sixty-year-old perfectly mimicking your expressions, perhaps only then do you know the real meaning of the word horror. Fortunately, the extreme distress H.M. suffered during such world-shattering incidents was always immediately and completely forgotten as soon as his attention could be distracted by something happening in the new moment. Not remembering can sometimes be a great blessing.

The discovery of hippocampus-as-memory-consolidator was critical. What memory specialists have been trying to figure out ever since then is, once formed, where do these long-term memories actually reside? Are memories stored up in the front of the brain in the prefrontal cortex? On top, in the parietal lobe? In the brainstem at the base of the brain? Where?

One tantalizing theory emerged in the late 1950s: memories were everywhere, stored in discrete molecules scattered throughout the brain. A stampede to confirm this notion was set off by a 1962 Journal of Neuropsychiatry article, “Memory Transfer Through Cannibalism in Planaria,” in which the University of Michigan’s James McConnell eagerly reported that worms could capture specific memories of other worms simply by eating those worms. McConnell had trained a group of flatworms to respond to light in a noninstinctive way. He then killed these worms, chopped them up, and fed them to untrained flatworms. After eating their brethren, McConnell claimed, the untrained worms proceeded to behave as though they had been trained—they had somehow acquired the memory of the trained worms. It was the unexpected apotheosis of the old saying, “You are what you eat.”

Out of this report numerous research grants were born, some of which yielded tantalizing results. Three years after McConnell’s initial study, four California scientists reported in the journal Science that when cells extracted from the brains of trained rats were injected into the guts of untrained rats, the untrained rats picked up the learned behavior of the trained rats. These experiments apparently showed that specific, concrete individual memories were embedded as information in discrete molecules in the same way that genetic information is embedded in DNA, and that these memories were transferable from brain to brain. A later experiment by Baylor University’s Georges Ungar was the most vivid yet: Brain cells from rats that had been trained to fear the dark were transferred to untrained mice (ordinarily, neither mice nor rats fear the dark), who suddenly took on this new fear. Ungar even isolated a peptide comprising fifteen amino acids that he said contained the newly created memory. He called the transmissible fear-of-the-dark memory molecule scotophobin.

The theory that emerged out of these experiments was of memory as a distinct informational molecule that could be created organically in one brain, isolated, and then transferred to another brain—even to the brain of another species. Its implications were immense. Had this cold fusion of an idea been validated rather than widely discredited not long after Ungar’s paper was published in Nature in 1972, it is clear that ours would be a very different world today: Memory swaps. Consciousness transfers. Neurochemical behavioral enhancements that would make Prozac seem like baby aspirin. The rapid decoding of a hidden science of memory molecules might well have spawned a new type of biochemical computer that could register, react to, and even create memory molecules of its own. Laptops (or cars or stuffed animals) could be afraid of the dark or partial to jazz or concerned about child abuse. Memories and feelings could be bottled and sold as easily as perfume.

But that world did not, and cannot, emerge. The memory transfer experiments, while entertaining and even seductive—DNA pioneer Francis Crick was among the many prestigious scientists on board for a while—were ultimately dismissed as seriously misguided. The idea of transferable memories strained credulity to begin with; to suggest that one animal’s specific fear could travel through another animal’s digestive tract, enter its bloodstream, find its way to the brain, and turn itself on again in the new host mind was an even further stretch.

And then there was the problem of physical mass. Skeptics calculated that if specific memories were contained in molecules the way Ungar suggested, the total number of memories accumulated over a lifetime would weigh somewhere in the vicinity of 220 pounds. The brain would literally be weighed down by thought and ideas.

After a decade or so, the notion and burgeoning industry of memory molecules crumbled into dust. It is now one particularly humiliating memory that many neuroscientists would just as soon not retain. What has grown up out of that rubble over the last thirty years is a very different understanding of memory—not as a substance but as a system. Memories are scattered about; that part the memory molecularists had right. Memory is everywhere. But it is everywhere in such a way that it is impossible to point to any one spot and identify it with an explicit memory. We now know that memory, like consciousness itself, isn’t a thing that can be isolated or extracted, but a living process, a vast and dynamic interaction of neuronal synapses involved in what Harvard’s Daniel Schacter elegantly terms “a temporary constellation of activity.” Each specific memory is a unique network of neurons from different regions of the brain coordinating with one another. Schacter explains:

A typical incident in our everyday lives consists of numerous sights, sounds, actions, and words. Different areas of the brain analyze these various aspects of an event. As a result, neurons in the different regions become more strongly connected to one another. The new pattern of connections constitutes the brain’s record of the event.

The power of the constellation idea is reinforced by the understanding of just how connected the 100 billion neurons in the brain actually are. A. G. Cairns-Smith, of the University of Glasgow, observes that no single brain cell is separated from any other brain cell by more than six or seven intermediaries.