How to be Alone


There ensued some months of relative optimism. The cancer was eradicated, my mother’s knee finally improved, and her native hopefulness returned to her letters. She reported that my father had taken first place in a game of bridge: “With his confusion cleared up & his less conservative approach to the game he is doing remarkably well & it’s about the only thing he enjoys (& can stay awake for!).” But my father’s anxiety about his health did not abate; he had stomach pains that he was convinced were caused by cancer. Gradually, the import of the story my mother was telling me migrated from the personal and the moral toward the psychiatric. “The past six months we have lost so many friends it is very unsettling—part of Dad’s nervousness & depression I’m sure,” she wrote in February 1992. The letter continued:

Dad’s internist, Dr. Rouse, has about concluded what I have felt all along regarding Dad’s stomach discomfort (he’s ruled out all clinical possibilities). Dad is (1) terribly nervous, (2) terribly depressed & I hope Dr. Rouse will put him on an anti-depressant. I know there has to be help for this … There have been disturbing, distressing things in our lives the past year, I know that very well, but Dad’s mental condition is hurting him physically & if he won’t go for counseling (suggested by Dr. Weiss) perhaps he now will accept pills or whatever it takes for nervousness & depression.

For a while, the phrase “nervousness & depression” was a fixture of her letters. Prozac briefly seemed to lift my father’s spirits, but the effects were short-lived. Finally, in July 1992, to my surprise, he agreed to see a psychiatrist.

My father had always been supremely suspicious of psychiatry. He viewed therapy as an invasion of privacy, mental health as a matter of self-discipline, and my mother’s increasingly pointed suggestions that he “talk to someone” as acts of aggression—little lobbed grenades of blame for their unhappiness as a couple. It was a measure of his desperation that he voluntarily set foot in a psychiatrist’s office.

In October, when I stopped in St. Louis on my way to Italy, I asked him about his sessions with the doctor. He made a hopeless gesture with his hands. “He’s extremely able,” he said. “But I’m afraid he’s written me off.”

The idea of anybody writing my father off was more than I could stand. From Italy I sent the psychiatrist a three-page appeal for reconsideration, but even as I was writing it the roof was caving in at home. “Much as I dislike telling you,” my mother wrote in a letter faxed to Italy, “Dad has regressed terribly. Medicine for the urinary problem a urologist is treating in combination with medication for depression and nervousness blew his mind again and the hallucinating, etc. was terrible.” There had been a weekend with my Uncle Erv in Indiana, where my father, removed from his familiar surroundings, unleashed a night of madness that culminated in my uncle’s shouting into his face, “Earl, my God, it’s your brother, Erv, we slept in the same bed!” Back in St. Louis, my father had begun to rage against the retired lady, Mrs. Pryble, whom my mother had engaged to sit with him two mornings a week while she ran errands. He didn’t see why he needed sitting, and, even assuming that he did need sitting, he didn’t see why a stranger, rather than his wife, should be doing it. He’d become a classic “sundowner,” dozing through the day and rampaging in the wee hours.

There followed a dismal holiday visit during which my wife and I finally intervened on my mother’s behalf and put her in touch with a geriatric social worker, and my mother urged my wife and me to tire my father out so that he would sleep through the night without psychotic incident, and my father sat stone-faced by the fireplace or told grim stories of his childhood while my mother fretted about the expense, the prohibitive expense, of sessions with a social worker. But even then, as far as I can remember, nobody ever said “dementia.” In all my mother’s letters to me, the word “Alzheimer’s” appears exactly once, in reference to an old German woman I worked for as a teenager.

I REMEMBER my suspicion and annoyance, fifteen years ago, when the term “Alzheimer’s disease” was first achieving currency. It seemed to me another instance of the medicalization of human experience, the latest entry in the ever-expanding nomenclature of victimhood. To my mother’s news about my old employer I replied: “What you describe sounds like the same old Erika, only quite a bit worse, and that’s not how Alzheimer’s is supposed to work, is it? I spend a few minutes every month fretting about ordinary mental illness being trendily misdiagnosed as Alzheimer’s.”

From my current vantage, where I spend a few minutes every month fretting about what a self-righteous thirty-year-old I was, I can see my reluctance to apply the term “Alzheimer’s” to my father as a way of protecting the specificity of Earl Franzen from the generality of a nameable condition. Conditions have symptoms; symptoms point to the organic basis of everything we are. They point to the brain as meat. And, where I ought to recognize that, yes, the brain is meat, I seem instead to maintain a blind spot across which I then interpolate stories that emphasize the more soul-like aspects of the self. Seeing my afflicted father as a set of organic symptoms would invite me to understand the healthy Earl Franzen (and the healthy me) in symptomatic terms as well—to reduce our beloved personalities to finite sets of neurochemical coordinates. Who wants a story of life like that?

Even now, I feel uneasy when I gather facts about Alzheimer’s. Reading, for example, David Shenk’s book The Forgetting: Alzheimer’s: Portrait of an Epidemic, I’m reminded that when my father got lost in his own neighborhood, or forgot to flush the toilet, he was exhibiting symptoms identical to those of millions of other afflicted people. There can be comfort in having company like this, but I’m sorry to see the personal significance drained from certain mistakes of my father’s, like his confusion of my mother with her mother, which struck me at the time as singular and orphic, and from which I gleaned all manner of important new insights into my parents’ marriage. My sense of private selfhood turns out to have been illusory.

Senile dementia has been around for as long as people have had the means of recording it. While the average human life span remained short and old age was a comparative rarity, senility was considered a natural by-product of aging—perhaps the result of sclerotic cerebral arteries. The young German neuropathologist Alois Alzheimer believed he was witnessing an entirely new variety of mental illness when, in 1901, he admitted to his clinic a fifty-one-year-old woman, Auguste D., who was suffering from bizarre mood swings and severe memory loss and who, in Alzheimer’s initial examination of her, gave problematic answers to his questions:

“What is your name?”

“Auguste.”

“Last name?”

“Auguste.”

“What is your husband’s name?”

“Auguste, I think.”

When Auguste D. died in an institution, four years later, Alzheimer availed himself of recent advances in microscopy and tissue-staining and was able to discern, in slides of her brain tissue, the striking dual pathology of her disease: countless sticky-looking globs of “plaque” and countless neurons engulfed by “tangles” of neuronal fibrils. Alzheimer’s findings greatly interested his patron Emil Kraepelin, then the dean of German psychiatry, who was engaged in a fierce scientific battle with Sigmund Freud and Freud’s psycholiterary theories of mental illness. To Kraepelin, Alzheimer’s plaques and tangles provided welcome clinical support for his contention that mental illness was fundamentally organic. In his Handbook of Psychiatry he dubbed Auguste D.’s condition Morbus Alzheimer.

For six decades after Alois Alzheimer’s autopsy of Auguste D., even as breakthroughs in disease prevention and treatment were adding fifteen years to life expectancy in developed nations, Alzheimer’s continued to be viewed as a medical rarity à la Huntington’s disease. David Shenk tells the story of an American neuropathologist named Meta Naumann who, in the early fifties, autopsied the brains of 210 victims of senile dementia and found sclerotic arteries in few of them, plaques and tangles in the majority. Here was ironclad evidence that Alzheimer’s was far more common than anyone had guessed; but Naumann’s work appears to have persuaded no one. “They felt that Meta was talking nonsense,” her husband recalled.

The scientific community simply wasn’t ready to consider that senile dementia might be more than a natural consequence of aging. In the early fifties there was no self-conscious category of “seniors,” no explosion of Sun Belt retirement communities, no AARP, no Early Bird tradition at low-end restaurants; and scientific thinking reflected these social realities. Not until the seventies did conditions become ripe for a reinterpretation of senile dementia. By then, as Shenk says, “so many people were living so long that senility didn’t feel so normal or acceptable anymore.” Congress passed the Research on Aging Act in 1974, and established the National Institute on Aging, for which funding soon mushroomed. By the end of the eighties, at the crest of my annoyance with the clinical term and its sudden ubiquity, Alzheimer’s had achieved the same social and medical standing as heart disease or cancer—and had the research funding levels to show for it.

What happened with Alzheimer’s in the seventies and eighties wasn’t simply a diagnostic paradigm shift. The number of new cases really is soaring. As fewer and fewer people drop dead of heart attacks or die of infections, more and more survive to become demented. Alzheimer’s patients in nursing homes live much longer than other patients, at a cost of at least forty thousand dollars annually per patient; until they’re institutionalized, they increasingly derange the lives of family members charged with caring for them. Already, five million Americans have the disease, and the number could rise to fifteen million by 2050.

Because there’s so much money in chronic illness, drug companies are investing feverishly in proprietary Alzheimer’s research while publicly funded scientists file for patents on the side. But because the science of the disease remains cloudy (a functioning brain is not a lot more accessible than the center of the earth or the edge of the universe), nobody can be sure which avenues of research will lead to effective treatments. Overall, the feeling in the field seems to be that if you’re under fifty you can reasonably expect to be offered effective drugs for Alzheimer’s by the time you need them. Then again, twenty years ago, many cancer researchers were predicting a cure within twenty years.

David Shenk, who is comfortably under fifty, makes the case in The Forgetting that a cure for senile dementia might not be an entirely unmitigated blessing. He notes, for example, that one striking peculiarity of the disease is that its “sufferers” often suffer less and less as it progresses. Caring for an Alzheimer’s patient is gruelingly repetitious precisely because the patient himself has lost the cerebral equipment to experience anything as a repetition. Shenk quotes patients who speak of “something delicious in oblivion” and who report an enhancement of their sensory pleasures as they come to dwell in an eternal, pastless Now. If your short-term memory is shot, you don’t remember, when you stoop to smell a rose, that you’ve been stooping to smell the same rose all morning.

As the psychiatrist Barry Reisberg first observed twenty years ago, the decline of an Alzheimer’s patient mirrors in reverse the neurological development of a child. The earliest capacities a child develops—raising the head (at one to three months), smiling (two to four months), sitting up unassisted (six to ten months)—are the last capacities an Alzheimer’s patient loses. Brain development in a growing child is consolidated through a process called myelinization, wherein the axonal connections among neurons are gradually strengthened by sheathings of the fatty substance myelin. Apparently, since the last regions of the child’s brain to mature remain the least myelinated, they’re the regions most vulnerable to the insult of Alzheimer’s. The hippocampus, which processes short-term memories into long-term, is very slow to myelinize. This is why we’re unable to form permanent episodic memories before the age of three or four, and why the hippocampus is where the plaques and tangles of Alzheimer’s first appear. Hence the ghostly apparition of the middle-stage patient who continues to be able to walk and feed herself even as she remembers nothing from hour to hour. The inner child isn’t inner anymore. Neurologically speaking, we’re looking at a one-year-old.

Although Shenk tries valiantly to see a boon in the Alzheimer’s patient’s childish relief from responsibility and childlike focus on the Now, I’m mindful that becoming a baby again was the last thing my father wanted. The stories he told from his childhood in northern Minnesota were mainly (as befits a depressive’s recollections) horrible: brutal father, unfair mother, endless chores, backwoods poverty, family betrayals, hideous accidents. He told me more than once, after his retirement, that his greatest pleasure in life had been going to work as an adult in the company of other men who valued his abilities. My father was an intensely private person, and privacy for him had the connotation of keeping the shameful content of one’s interior life out of public sight. Could there have been a worse disease for him than Alzheimer’s? In its early stages, it worked to dissolve the personal connections that had saved him from the worst of his depressive isolation. In its later stages it robbed him of the sheathing of adulthood, the means to hide the child inside him. I wish he’d had a heart attack instead.

Still, shaky though Shenk’s arguments for the brighter side of Alzheimer’s may be, his core contention is harder to dismiss: senility is not merely an erasure of meaning but a source of meaning. For my mother, the losses of Alzheimer’s both amplified and reversed long-standing patterns in her marriage. My father had always refused to open himself to her, and now, increasingly, he couldn’t open himself. To my mother, he remained the same Earl Franzen napping in the den and failing to hear. She, paradoxically, was the one who slowly and surely lost her self, living with a man who mistook her for her mother, forgot every fact he’d ever known about her, and finally ceased to speak her name. He, who had always insisted on being the boss in the marriage, the maker of decisions, the adult protector of the childlike wife, now couldn’t help behaving like the child. Now the unseemly outbursts were his, not my mother’s. Now she ferried him around town the way she’d once ferried me and my brothers. Task by task, she took charge of their life. And so, although my father’s “long illness” was a crushing strain and disappointment to her, it was also an opportunity to grow slowly into an autonomy she’d never been allowed: to settle some very old scores.

As for me, once I accepted the scope of the disaster, the sheer duration of Alzheimer’s forced me into unexpectedly welcome closer contact with my mother. I learned, as I might not have otherwise, that I could seriously rely on my brothers and that they could rely on me. And, strangely, although I’d always prized my intelligence and sanity and self-consciousness, I found that watching my father lose all three made me less afraid of losing them myself. I became a little less afraid in general. A bad door opened, and I found I was able to walk through it.

THE DOOR in question was on the fourth floor of Barnes Hospital, in St. Louis. About six weeks after my wife and I had put my mother in touch with the social worker and gone back east, my oldest brother and my father’s doctors persuaded him to enter the hospital for testing. The idea was to get all the medications out of his bloodstream and see what we were dealing with underneath. My mother helped him check in and spent the afternoon settling him into his room. He was still his usual, semipresent self when she left for dinner, but that evening, at home, she began to get calls from the hospital, first from my father, who demanded that she come and remove him from “this hotel,” and then from nurses who reported that he’d become belligerent. When she returned to the hospital in the morning, she found him altogether gone—raving mad, profoundly disoriented.

I flew back to St. Louis a week later. My mother took me straight from the airport to the hospital. While she spoke to the nurses, I went to my father’s room and found him in bed, wide awake. I said hello. He made frantic shushing gestures and beckoned me to his pillow. I leaned over him and he asked me, in a husky whisper, to keep my voice down because “they” were “listening.” I asked him who “they” were. He couldn’t tell me, but his eyes rolled fearfully to scan the room, as if he’d lately seen “them” everywhere and were puzzled by “their” disappearance. When my mother appeared in the doorway, he confided to me, in an even lower whisper, “I think they’ve gotten to your mother.”

My memories of the week that followed are mainly a blur, punctuated by a couple of life-changing scenes. I went to the hospital every day and sat with my father for as many hours as I could stand. At no point did he string together two coherent sentences. The memory that appears to me most significant in hindsight is a very peculiar one. It’s lit by a dreamlike indoor twilight, it’s set in a hospital room whose orientation and cramped layout are unfamiliar from any of my other memories, and it returns to me now without any of the chronological markers that usually characterize my memories. I’m not sure it even dates from that first week I saw my father in the hospital. And yet I’m sure that I’m not remembering a dream. All memories, the neuroscientists say, are actually memories of memory, but usually they don’t feel that way. Here’s one that does. I remember remembering: my father in bed, my mother sitting beside it, me standing near the door. We’ve been having an anguished family conversation, possibly about where to move my father after his discharge from the hospital. It’s a conversation that my father, to the slight extent that he can follow it, is hating. Finally he cries out with passionate emphasis, as if he’s had enough of all the nonsense, “I have always loved your mother. Always.” And my mother buries her face in her hands and sobs.

This was the only time I ever heard my father say he loved her. I’m certain the memory is legitimate because the scene seemed to me immensely significant even at the time, and I then described it to my wife and brothers and incorporated it into the story I was telling myself about my parents. In later years, when my mother insisted that my father had never said he loved her, not even once, I asked if she remembered that time in the hospital. I repeated what he’d said, and she shook her head uncertainly. “Maybe,” she said. “Maybe he did. I don’t remember that.”

My brothers and I took turns going to St. Louis every few months. My father never failed to recognize me as someone he was happy to see. His life in a nursing home appeared to be an endless troubled dream populated by figments from his past and by his deformed and brain-damaged fellow inmates; his nurses were less like actors in the dream than like unwelcome intruders on it. Unlike many of the female inmates, who at one moment were wailing like babies and at the next moment glowing with pleasure while someone fed them ice cream, I never saw my father cry, and the pleasure he took in ice cream never ceased to look like an adult’s. He gave me significant nods and wistful smiles as he confided to me fragments of nonsense to which I nodded as if I understood. His most consistently near-coherent theme was his wish to be removed from “this hotel” and his inability to understand why he couldn’t live in a little apartment and let my mother take care of him.

For Thanksgiving that year, my mother and my wife and I checked him out of the nursing home and brought him home with a wheelchair in my Volvo station wagon. He hadn’t been in the house since he’d last been living there, ten months earlier. If my mother had been hoping for a gratifying show of pleasure from him, she was disappointed; by then, a change of venue no more impressed my father than it does a one-year-old. We sat by the fireplace and, out of unthinking, wretched habit, took pictures of a man who, if he knew nothing else, seemed full of unhappy knowledge of how dismal a subject for photographs he was. The images are awful to me now: my father listing in his wheelchair like an unstrung marionette, eyes mad and staring, mouth sagging, glasses smeared with strobe light and nearly falling off his nose; my mother’s face a mask of reasonably well-contained despair; and my wife and I flashing grotesquely strained smiles as we reach to touch my father. At the dinner table my mother spread a bath towel over my father and cut his turkey into little bites. She kept asking him if he was happy to be having Thanksgiving dinner at home. He responded with silence, shifting eyes, sometimes a faint shrug. My brothers called to wish him a happy holiday; and here, out of the blue, he mustered a smile and a hearty voice, he was able to answer simple questions, he thanked them both for calling.

This much of the evening was typically Alzheimer’s. Because children learn social skills very early, a capacity for gestures of courtesy and phrases of vague graciousness survives in many Alzheimer’s patients long after their memories are shot. It wasn’t so remarkable that my father was able to handle (sort of) my brothers’ holiday calls. But consider what happened next, after dinner, outside the nursing home. While my wife ran inside for a geri chair, my father sat beside me and studied the institutional portal that he was about to reenter. “Better not to leave,” he told me in a clear, strong voice, “than to have to come back.” This was not a vague phrase; it pertained directly to the situation at hand, and it strongly suggested an awareness of his larger plight and his connection to the past and future. He was requesting that he be spared the pain of being dragged back toward consciousness and memory. And, sure enough, on the morning after Thanksgiving, and for the remainder of our visit, he was as crazy as I ever saw him, his words a hash of random syllables, his body a big flail of agitation.

For David Shenk, the most important of the “windows onto meaning” afforded by Alzheimer’s is its slowing down of death. Shenk likens the disease to a prism that refracts death into a spectrum of its otherwise tightly conjoined parts—death of autonomy, death of memory, death of self-consciousness, death of personality, death of body—and he subscribes to the most common trope of Alzheimer’s: that its particular sadness and horror stem from the sufferer’s loss of his or her “self” long before the body dies.

This seems mostly right to me. By the time my father’s heart stopped, I’d been mourning him for years. And yet, when I consider his story, I wonder whether the various deaths can ever really be so separated, and whether memory and consciousness have such secure title, after all, to the seat of selfhood. I can’t stop looking for meaning in the two years that followed his loss of his supposed “self,” and I can’t stop finding it.

I’m struck, above all, by the apparent persistence of his will. I’m powerless not to believe that he was exerting some bodily remnant of his self-discipline, some reserve of strength in the sinews beneath both consciousness and memory, when he pulled himself together for the request he made to me outside the nursing home. I’m powerless as well not to believe that his crash on the following morning, like his crash on his first night alone in a hospital, amounted to a relinquishment of that will, a letting-go, an embrace of madness in the face of unbearable emotion. Although we can fix the starting point of his decline (full consciousness and sanity) and the end point (oblivion and death), his brain wasn’t simply a computational device running gradually and inexorably amok. Where the subtractive progress of Alzheimer’s might predict a steady downward trend like this—

[figure: a smooth, steadily declining line]

what I saw of my father’s fall looked more like this:

[figure: a stepped line of long plateaus broken by abrupt drops]

He held himself together longer, I suspect, than it might have seemed he had the neuronal wherewithal to do. Then he collapsed and fell lower than his pathology may have strictly dictated, and he chose to stay low, ninety-nine percent of the time. What he wanted (in the early years, to stay clear; in the later years, to let go) was integral to what he was. And what I want (stories of my father’s brain that are not about meat) is integral to what I choose to remember and retell.

One of the stories I’ve come to tell, then, as I try to forgive myself for my long blindness to his condition, is that he was bent on concealing that condition and, for a remarkably long time, retained the strength of character to bring it off. My mother used to swear that this was so. He couldn’t fool the woman he lived with, no matter how he bullied her, but he could pull himself together as long as he had sons in town or guests in the house. The true solution of the conundrum of my stay with him during my mother’s operation probably has less to do with my blindness than with the additional will he was exerting.

After the bad Thanksgiving, when we knew he was never coming home again, I helped my mother sort through his desk. (It’s the kind of liberty you take with the desk of a child or a dead person.) In one of his drawers we found evidence of small, covert endeavors not to forget. There was a sheaf of papers on which he’d written the addresses of his children, one address per slip, the same address on several. On another slip he’d written the birth dates of his older sons—“Bob 1–13–48” and “TOM 10–15–50”—and then, in trying to recall mine (August 17, 1959), he had erased the month and day and made a guess on the basis of my brothers’ dates: “JON 10–13–49.”

Consider, too, what I believe are the last words he ever spoke to me, three months before he died. For a couple of days, I’d been visiting the nursing home for a dutiful ninety minutes and listening to his mutterings about my mother and to his affable speculations about certain tiny objects that he persisted in seeing on the sleeves of his sweater and the knees of his pants. He was no different when I dropped by on my last morning, no different when I wheeled him back to his room and told him I was heading out of town. But then he raised his face toward mine and—again, out of nowhere, his voice was clear and strong—he said: “Thank you for coming. I appreciate your taking the time to see me.”

Set phrases of courtesy? A window on his fundamental self? I seem to have little choice about which version to believe.

IN RELYING ON MY MOTHER’S LETTERS to reconstruct my father’s disintegration, I feel the shadow of the undocumented years after 1992, when she and I talked on the phone at greater length and ceased to write all but the briefest notes. Plato’s description of writing, in the Phaedrus, as a “crutch of memory” seems to me fully accurate: I couldn’t tell a clear story of my father without those letters. But, where Plato laments the decline of the oral tradition and the atrophy of memory which writing induces, I at the other end of the Age of the Written Word am impressed by the sturdiness and reliability of words on paper. My mother’s letters are truer and more complete than my self-absorbed and biased memories; she’s more alive to me in the written phrase “he NEEDS distractions!” than in hours of videotape or stacks of pictures of her.

The will to record indelibly, to set down stories in permanent words, seems to me akin to the conviction that we are larger than our biologies. I wonder if our current cultural susceptibility to the charms of materialism—our increasing willingness to see psychology as chemical, identity as genetic, and behavior as the product of bygone exigencies of human evolution—isn’t intimately related to the postmodern resurgence of the oral and the eclipse of the written: our incessant telephoning, our ephemeral e-mailing, our steadfast devotion to the flickering tube.

Have I mentioned that my father, too, wrote letters? Usually typewritten, usually prefaced with an apology for misspellings, they came much less frequently than my mother’s. One of the last is from December 1987:

This time of the year is always difficult for me. I’m ill at ease with all the gift-giving, as I would love to get things for people but lack the imagination to get the right things. I dread the shopping for things that are the wrong size or the wrong color or something not needed, and anticipate the problems of returning or exchanging. I like to buy tools, but Bob pointed out a problem with this category, when for some occasion I gave him a nice little hammer with good balance, and his comment was that this was the second or third hammer and I don’t need any more, thank you. And then there is the problem of gifts for your mother. She is so sentimental that it hurts me not to get her something nice, but she has access to my checking account with no restrictions. I have told her to buy something for herself, and say it is from me, so she can compete with the after-Christmas comment: “See what I got from my husband!” But she won’t participate in that fraud. So I suffer through the season.

In 1989, as his powers of concentration waned with his growing “nervousness & depression,” my father stopped writing letters altogether. My mother and I were therefore amazed to find, in the same drawer in which he’d left those addresses and birth dates, an unsent letter dated January 22, 1993—unimaginably late, a matter of weeks before his final breakdown. The letter was in an envelope addressed to my nephew Nick, who, at age six, had just begun to write letters himself. Possibly my father was ashamed to send a letter that he knew wasn’t fully coherent; more likely, given the state of his hippocampal health, he simply forgot. The letter, which for me has become an emblem of invisibly heroic exertions of the will, is written in a tiny penciled script that keeps veering away from the horizontal:

Dear Nick,

We got your letter a couple days ago and were pleased to see how well you were doing in school, particularly in math. It is important to write well, as the ability to exchange ideas will govern the use that one country can make of another country’s ideas.

Most of your nearest relatives are good writers, and thereby took the load off me. I should have learned better how to write, but it is so easy to say, Let Mom do it.

I know that my writing will not be easy to read, but I have a problem with the nerves in my legs and tremors in my hands. In looking at what I have written, I expect you will have difficulty to understand, but with a little luck, I may keep up with you.

We have had a change in the weather from cold and wet to dry with fair blue skies. I hope it stays this way. Keep up the good work.

Love, Grandpa

P.S. Thank you for the gifts.

MY FATHER’S HEART and lungs were very strong, and my mother was bracing herself for two or three more years of endgame when, one day in April 1995, he stopped eating. Maybe he was having trouble swallowing, or maybe, with his remaining shreds of will, he’d resolved to put an end to his unwanted second childhood.

His blood pressure was seventy over palpable when I flew into town. Again, my mother took me straight to the nursing home from the airport. I found him curled up on his side under a thin sheet, breathing shallowly, his eyes shut loosely. His muscle had wasted away, but his face was smooth and calm and almost entirely free of wrinkles, and his hands, which had changed not at all, seemed curiously large in comparison to the rest of him. There’s no way to know if he recognized my voice, but within minutes of my arrival his blood pressure climbed to 120/90. I worried then, worry even now, that I made things harder for him by arriving: that he’d reached the point of being ready to die but was ashamed to perform such a private or disappointing act in front of one of his sons.

My mother and I settled into a rhythm of watching and waiting, one of us sleeping while the other sat in vigil. Hour after hour, my father lay unmoving and worked his way toward death; but when he yawned, the yawn was his. And his body, wasted though it was, was likewise still radiantly his. Even as the surviving parts of his self grew ever smaller and more fragmented, I persisted in seeing a whole. I still loved, specifically and individually, the man who was yawning in that bed. And how could I not fashion stories out of that love—stories of a man whose will remained intact enough to avert his face when I tried to clear his mouth out with a moist foam swab? I’ll go to my own grave insisting that my father was determined to die and to die, as best he could, on his own terms.

We, for our part, were determined that he not be alone when he died. Maybe this was exactly wrong, maybe all he was waiting for was to be left alone. Nevertheless, on my sixth night in town, I stayed up and read a light novel cover to cover while he lay and breathed and loosed his great yawns. A nurse came by, listened to his lungs, and told me he must never have been a smoker. She suggested that I go home to sleep, and she offered to send in a particular nurse from the floor below to visit him. Evidently, the nursing home had a resident angel of death with a special gift for persuading the nearly dead, after their relatives had left for the night, that it was OK for them to die. I declined the nurse’s offer and performed this service myself. I leaned over my father, who smelled faintly of acetic acid but was otherwise clean and warm. Identifying myself, I told him that whatever he needed to do now was fine by me, he should let go and do what he needed to do.

Late that afternoon, a big early-summer St. Louis wind kicked up. I was scrambling eggs when my mother called from the nursing home and told me to hurry over. I don’t know why I thought I had plenty of time, but I ate the eggs with some toast before I left, and in the nursing-home parking lot I sat in the car and turned up the radio, which was playing the Blues Traveler song that was all the rage that season. No song has ever made me happier. The great white oaks all around the nursing home were swaying and turning pale in the big wind. I felt as though I might fly away with happiness.

And still he didn’t die. The storm hit the nursing home in the middle of the evening, knocking out all but the emergency lighting, and my mother and I sat in the dark. I don’t like to remember how impatient I was for my father’s breathing to stop, how ready to be free of him I was. I don’t like to imagine what he was feeling as he lay there, what dim or vivid sensory or emotional forms his struggle took inside his head. But I also don’t like to believe that there was nothing.

Toward ten o’clock, my mother and I were conferring with a nurse in the doorway of his room, not long after the lights came back on, when I noticed that he was drawing his hands up toward his throat. I said, “I think something is happening.” It was agonal breathing: his chin rising to draw air into his lungs after his heart had stopped beating. He seemed to be nodding very slowly and deeply in the affirmative. And then nothing.

After we’d kissed him goodbye and signed the forms that authorized the brain autopsy, after we’d driven through flooding streets, my mother sat down in our kitchen and uncharacteristically accepted my offer of undiluted Jack Daniel’s. “I see now,” she said, “that when you’re dead you’re really dead.” This was true enough. But, in the slow-motion way of Alzheimer’s, my father wasn’t much deader now than he’d been two hours or two weeks or two months ago. We’d simply lost the last of the parts out of which we could fashion a living whole. There would be no new memories of him. The only stories we could tell now were the ones we already had.

[2001]

IMPERIAL BEDROOM

PRIVACY, privacy, the new American obsession: espoused as the most fundamental of rights, marketed as the most desirable of commodities, and pronounced dead twice a week.

Even before Linda Tripp pressed the “Record” button on her answering machine, commentators were warning us that “privacy is under siege,” that “privacy is in a dreadful state,” that “privacy as we now know it may not exist in the year 2000.” They say that both Big Brother and his little brother, John Q. Public, are shadowing me through networks of computers. They tell me that security cameras no bigger than spiders are watching from every shaded corner, that dour feminists are monitoring bedroom behavior and watercooler conversations, that genetic sleuths can decoct my entire being from a droplet of saliva, that voyeurs can retrofit ordinary camcorders with a filter that lets them see through people’s clothing. Then comes the flood of dirty suds from the Office of the Independent Counsel, oozing forth through official and commercial channels to saturate the national consciousness. The Monica Lewinsky scandal marks, in the words of the philosopher Thomas Nagel, “the culmination of a disastrous erosion” of privacy; it represents, in the words of the author Wendy Kaminer, “the utter disregard for privacy and individual autonomy that exists in totalitarian regimes.” In the person of Kenneth Starr, the “public sphere” has finally overwhelmed—shredded, gored, trampled, invaded, run roughshod over—“the private.”

The panic about privacy has all the finger-pointing and paranoia of a good old American scare, but it’s missing one vital ingredient: a genuinely alarmed public. Americans care about privacy mainly in the abstract. Sometimes a well-informed community unites to defend itself, as when Net users bombarded the White House with e-mails against the “clipper chip,” and sometimes an especially outrageous piece of news provokes a national outcry, as when the Lotus Development Corporation tried to market a CD-ROM containing financial profiles of nearly half the people in the country. By and large, though, even in the face of wholesale infringements like the war on drugs, Americans remain curiously passive. I’m no exception. I read the editorials and try to get excited, but I can’t. More often than not, I find myself feeling the opposite of what the privacy mavens want me to. It’s happened twice in the last month alone.

On the Saturday morning when the Times came carrying the complete text of the Starr report, what I felt as I sat alone in my apartment and tried to eat my breakfast was that my own privacy—not Clinton’s, not Lewinsky’s—was being violated. I love the distant pageant of public life. I love both the pageantry and the distance. Now a President was facing impeachment, and as a good citizen I had a duty to stay informed about the evidence, but the evidence here consisted of two people’s groping, sucking, and mutual self-deception. What I felt, when this evidence landed beside my toast and coffee, wasn’t a pretend revulsion to camouflage a secret interest in the dirt; I wasn’t offended by the sex qua sex; I wasn’t worrying about a potential future erosion of my own rights; I didn’t feel the President’s pain in the empathic way he’d once claimed to feel mine; I wasn’t repelled by the revelation that public officials do bad things; and, although I’m a registered Democrat, my disgust was of a different order from my partisan disgust at the news that the Giants have blown a fourth-quarter lead. What I felt I felt personally. I was being intruded on.

A couple of days later, I got a call from one of my credit-card providers, asking me to confirm two recent charges at a gas station and one at a hardware store. Queries like this are common nowadays, but this one was my first, and for a moment I felt eerily exposed. At the same time, I was perversely flattered that someone, somewhere, had taken an interest in me and had bothered to phone. Not that the young male operator seemed to care about me personally. He sounded like he was reading his lines from a laminated booklet. The strain of working hard at a job he almost certainly didn’t enjoy seemed to thicken his tongue. He tried to rush his words out, to speed through them as if in embarrassment or vexation at how nearly worthless they were, but they kept bunching up in his teeth, and he had to stop and extract them with his lips, one by one. It was the computer, he said, the computer that routinely, ah, scans the, you know, the pattern of charges … and was there something else he could help me with tonight? I decided that if this young person wanted to scroll through my charges and ponder the significance of my two fill-ups and my gallon of latex paint, I was fine with it.

So here’s the problem. On the Saturday morning the Starr Report came out, my privacy was, in the classic liberal view, absolute. I was alone in my home and unobserved, unbothered by neighbors, unmentioned in the news, and perfectly free, if I chose, to ignore the report and do the pleasantly al dente Saturday crossword; yet the report’s mere existence so offended my sense of privacy that I could hardly bring myself to touch the thing. Two days later, I was disturbed in my home by a ringing phone, asked to cough up my mother’s maiden name, and made aware that the digitized minutiae of my daily life were being scrutinized by strangers; and within five minutes I’d put the entire episode out of my mind. I felt encroached on when I was ostensibly safe, and I felt safe when I was ostensibly encroached on. And I didn’t know why.

THE RIGHT to privacy—defined by Louis Brandeis and Samuel Warren, in 1890, as “the right to be let alone”—seems at first glance to be an elemental principle in American life. It’s the rallying cry of activists fighting for reproductive rights, against stalkers, for the right to die, against a national health-care database, for stronger data-encryption standards, against paparazzi, for the sanctity of employee e-mail, and against employee drug testing. On closer examination, though, privacy proves to be the Cheshire cat of values: not much substance, but a very winning smile.

Legally, the concept is a mess. Privacy violation is the emotional core of many crimes, from stalking and rape to Peeping Tommery and trespass, but no criminal statute forbids it in the abstract. Civil law varies from state to state but generally follows a forty-year-old analysis by the legal scholar Dean William Prosser, who dissected the invasion of privacy into four torts: intrusion on my solitude, the publishing of private facts about me which are not of legitimate public concern, publicity that puts my character in a false light, and appropriation of my name or likeness without my consent. This is a crumbly set of torts. Intrusion looks a lot like criminal trespass, false light like defamation, and appropriation like theft; and the harm that remains when these extraneous offenses are subtracted is so admirably captured by the phrase “infliction of emotional distress” as to render the tort of privacy invasion all but superfluous. What really undergirds privacy is the classical liberal conception of personal autonomy or liberty. In the last few decades, many judges and scholars have chosen to speak of a “zone of privacy,” rather than a “sphere of liberty,” but this is a shift in emphasis, not in substance: not the making of a new doctrine but the repackaging and remarketing of an old one.

Whatever you’re trying to sell, whether it’s luxury real estate or Esperanto lessons, it helps to have the smiling word “private” on your side. Last winter, as the owner of a Bank One Platinum Visa Card, I was offered enrollment in a program called PrivacyGuard, which, according to the literature promoting it, “puts you in the know about the very personal records available to your employer, insurers, credit card companies, and government agencies.” The first three months of PrivacyGuard were free, so I signed up. What came in the mail then was paperwork: envelopes and request forms for a Credit Record Search and other searches, also a disappointingly undeluxe logbook in which to jot down the search results. I realized immediately that I didn’t care enough about, say, my driving records to wait a month to get them; it was only when I called PrivacyGuard to cancel my membership, and was all but begged not to, that I realized that the whole point of this “service” was to harness my time and energy to the task of reducing Bank One Visa’s fraud losses.

Even issues that legitimately touch on privacy are rarely concerned with the actual emotional harm of unwanted exposure or intrusion. A proposed national Genetic Privacy Act, for example, is premised on the idea that my DNA reveals more about my identity and future health than other medical data do. In fact, DNA is as yet no more intimately revealing than a heart murmur, a family history of diabetes, or an inordinate fondness for Buffalo chicken wings. As with any medical records, the potential for abuse of genetic information by employers and insurers is chilling, but this is only tangentially a privacy issue; the primary harm consists of things like job discrimination and higher insurance premiums.

In a similar way, the problem of online security is mainly about nuts and bolts. What American activists call “electronic privacy” their European counterparts call “data protection.” Our term is exciting; theirs is accurate. If someone is out to steal your Amex number and expiration date, or if an evil ex-boyfriend is looking for your new address, you need the kind of hard-core secrecy that encryption seeks to guarantee. If you’re talking to a friend on the phone, however, you need only a feeling of privacy.

The social drama of data protection goes something like this: a hacker or an insurance company or a telemarketer gains access to a sensitive database, public-interest watchdogs bark loudly, and new firewalls go up. Just as most people are moderately afraid of germs but leave virology to the Centers for Disease Control, most Americans take a reasonable interest in privacy issues but leave the serious custodial work to experts. Our problem now is that the custodians have started speaking a language of panic and treating privacy not as one of many competing values but as the one value that trumps all others.

The novelist Richard Powers recently declared in a Times op-ed piece that privacy is a “vanishing illusion” and that the struggle over the encryption of digital communications is therefore as “great with consequence” as the Cold War. Powers defines “the private” as “that part of life that goes unregistered,” and he sees in the digital footprints we leave whenever we charge things the approach of “that moment when each person’s every living day will become a Bloomsday, recorded in complete detail and reproducible with a few deft keystrokes.” It is scary, of course, to think that the mystery of our identities might be reducible to finite data sequences. That Powers can seriously compare credit-card fraud and intercepted cell-phone calls to thermonuclear incineration, however, speaks mainly to the infectiousness of privacy panic. Where, after all, is it “registered” what Powers or anybody else is thinking, seeing, saying, wishing, planning, dreaming, and feeling ashamed of? A digital Ulysses consisting of nothing but a list of its hero’s purchases and other recordable transactions might run, at most, to four pages: was there really nothing more to Bloom’s day?

When Americans do genuinely sacrifice privacy, moreover, they do so for tangible gains in health or safety or efficiency. Most legalized infringements—HIV notification, airport X-rays, Megan’s Law, Breathalyzer roadblocks, the drug-testing of student athletes, laws protecting fetuses, laws protecting the vegetative, remote monitoring of automobile emissions, county-jail strip searches, even Ken Starr’s exposure of presidential corruption—are essentially public health measures. I resent the security cameras in Washington Square, but I appreciate the ones on a subway platform. The risk that someone is abusing my E-ZPass toll records seems to me comfortably low in comparison with my gain in convenience. Ditto the risk that some gossip rag will make me a victim of the First Amendment; with two hundred and seventy million people in the country, any individual’s chances of being nationally exposed are next to nil.

The legal scholar Lawrence Lessig has characterized Americans as “bovine” for making calculations like this and for thereby acquiescing in what he calls the “Sovietization” of personal life. The curious thing about privacy, though, is that simply by expecting it we can usually achieve it. One of my neighbors in the apartment building across the street spends a lot of time at her mirror examining her pores, and I can see her doing it, just as she can undoubtedly see me sometimes. But our respective privacies remain intact as long as neither of us feels seen. When I send a postcard through the U.S. mail, I’m aware in the abstract that mail handlers may be reading it, may be reading it aloud, may even be laughing at it, but I’m safe from all harm unless, by sheer bad luck, the one handler in the country whom I actually know sees the postcard and slaps his forehead and says, “Oh, jeez, I know this guy.”

OUR PRIVACY panic isn’t merely exaggerated. It’s founded on a fallacy. Ellen Alderman and Caroline Kennedy, in The Right to Privacy, sum up the conventional wisdom of privacy advocates like this: “There is less privacy than there used to be.” The claim has been made or implied so often, in so many books and editorials and talk-show dens, that Americans, no matter how passive they are in their behavior, now dutifully tell pollsters that they’re very much worried about privacy. From almost any historical perspective, however, the claim seems bizarre.

In 1890, an American typically lived in a small town under conditions of near-panoptical surveillance. Not only did his every purchase “register,” but it registered in the eyes and the memory of shopkeepers who knew him, his parents, his wife, and his children. He couldn’t so much as walk to the post office without having his movements tracked and analyzed by neighbors. Probably he grew up sleeping in the same bed with his siblings and possibly with his parents, too. Unless he was well off, his transportation—a train, a horse, his own two feet—either was communal or exposed him to the public eye.

In the suburbs and exurbs where the typical American lives today, tiny nuclear families inhabit enormous houses, in which each person has his or her own bedroom and, sometimes, bathroom. Compared even with suburbs in the sixties and seventies, when I was growing up, the contemporary condominium development or gated community offers a striking degree of anonymity. It’s no longer the rule that you know your neighbors. Communities increasingly tend to be virtual, the participants either faceless or firmly in control of the face they present. Transportation is largely private: the latest SUVs are the size of living rooms and come with onboard telephones, CD players, and TV screens; behind the tinted windows of one of these high-riding I-see-you-but-you-can’t-see-me mobile PrivacyGuard® units, a person can be wearing pajamas or a licorice bikini, for all anybody knows or cares. Maybe the government intrudes on the family a little more than it did a hundred years ago (social workers look in on the old and the poor, health officials require inoculations, the police inquire about spousal battery), but these intrusions don’t begin to make up for the small-town snooping they’ve replaced.

The “right to be let alone”? Far from disappearing, it’s exploding. It’s the essence of modern American architecture, landscape, transportation, communication, and mainstream political philosophy. The real reason that Americans are apathetic about privacy is so big as to be almost invisible: we’re flat-out drowning in privacy.

What’s threatened, then, isn’t the private sphere. It’s the public sphere. Much has been made of the discouraging effect that the Starr investigation may have on future aspirants to public office (only zealots and zeros need apply), but that’s just half of it. The public world of Washington, because it’s public, belongs to everyone. We’re all invited to participate with our votes, our patriotism, our campaigning, and our opinions. The collective weight of a population makes possible our faith in the public world as something larger and more enduring and more dignified than any messy individual can be in private. But, just as one sniper in a church tower can keep the streets of an entire town empty, one real gross-out scandal can undermine that faith.

If privacy depends upon an expectation of invisibility, the expectation of visibility is what defines a public space. My “sense of privacy” functions to keep the public out of the private and to keep the private out of the public. A kind of mental Border collie yelps in distress when I feel that the line between the two has been breached. This is why the violation of a public space is so similar, as an experience, to the violation of privacy. I walk past a man taking a leak on a sidewalk in broad daylight (delivery-truck drivers can be especially self-righteous in their “Ya gotta go, ya gotta go” philosophy of bladder management), and although the man with the yawning fly is ostensibly the one whose privacy is compromised by the leak, I’m the one who feels the impingement. Flashers and sexual harassers and fellators on the pier and self-explainers on the crosstown bus all similarly assault our sense of the “public” by exposing themselves.

Since really serious exposure in public today is assumed to be synonymous with being seen on television, it would seem to follow that televised space is the premier public space. Many things that people say to me on television, however, would never be tolerated in a genuine public space—in a jury box, for example, or even on a city sidewalk. TV is an enormous, ramified extension of the billion living rooms and bedrooms in which it’s consumed. You rarely hear a person on the subway talking loudly about, say, incontinence, but on television it’s been happening for years. TV is devoid of shame, and without shame there can be no distinction between public and private. Last winter, an anchorwoman looked me in the eye and, in the tone of a close female relative, referred to a litter of babies in Iowa as “America’s seven little darlin’s.” It was strange enough, twenty-five years ago, to get Dan Rather’s reports on Watergate between spots for Geritol and Bayer aspirin, as if Nixon’s impending resignation were somehow located in my medicine chest. Now, shelved between ads for Promise margarine and Celebrity Cruises, the news itself is a soiled cocktail dress—TV the bedroom floor and nothing but.

Reticence, meanwhile, has become an obsolete virtue. People now readily name their diseases, rents, antidepressants. Sexual histories get spilled on first dates, Birkenstocks and cutoffs infiltrate the office on casual Fridays, telecommuting puts the boardroom in the bedroom, “softer” modern office design puts the bedroom in the boardroom, salespeople unilaterally address customers by their first name, waiters won’t bring me food until I’ve established a personal relationship with them, voice-mail machinery stresses the “I” in “I’m sorry, but I don’t understand what you dialed,” and cyberenthusiasts, in a particularly grotesque misnomer, designate as “public forums” pieces of etched silicon with which a forum’s unshaved “participant” may communicate while sitting cross-legged in tangled sheets. The networked world as a threat to privacy? It’s the ugly spectacle of a privacy triumphant.

A genuine public space is a place where every citizen is welcome to be present and where the purely private is excluded or restricted. One reason that attendance at art museums has soared in recent years is that museums still feel public in this way. After those tangled sheets, how delicious the enforced decorum and the hush, the absence of in-your-face consumerism. How sweet the promenading, the seeing and being seen. Everybody needs a promenade sometimes—a place to go when you want to announce to the world (not the little world of friends and family but the big world, the real world) that you have a new suit, or that you’re in love, or that you suddenly realize you stand a full inch taller when you don’t hunch your shoulders.

Unfortunately, the fully public place is a nearly extinct category. We still have courtrooms and the jury pool, commuter trains and bus stations, here and there a small-town Main Street that really is a main street rather than a strip mall, certain coffee bars, and certain city sidewalks. Otherwise, for American adults, the only halfway public space is the world of work. Here, especially in the upper echelons of business, codes of dress and behavior are routinely enforced, personal disclosures are penalized, and formality is still the rule. But these rituals extend only to the employees of the firm, and even they, when they become old, disabled, obsolete, or outsourceable, are liable to be expelled and thereby relegated to the tangled sheets.

The last big, steep-walled bastion of public life in America is Washington, D.C. Hence the particular violation I felt when the Starr Report crashed in. Hence the feeling of being intruded on. It was privacy invasion, all right: private life brutally invading the most public of public spaces. I don’t want to see sex on the news from Washington. There’s sex everywhere else I look—on sitcoms, on the Web, on dust jackets, in car ads, on the billboards at Times Square. Can’t there be one thing in the national landscape that isn’t about the bedroom? We all know there’s sex in the cloakrooms of power, sex behind the pomp and circumstance, sex beneath the robes of justice; but can’t we act like grownups and pretend otherwise? Pretend not that “no one is looking” but that everyone is looking?

For two decades now, business leaders and politicians across much of the political spectrum, both Gingrich Republicans and Clinton Democrats, have extolled the virtues of privatizing public institutions. But what better word can there be for Lewinskygate and the ensuing irruption of disclosures (the infidelities of Helen Chenoweth, of Dan Burton, of Henry Hyde) than “privatization”? Anyone who wondered what a privatized presidency might look like may now, courtesy of Mr. Starr, behold one.

IN DENIS JOHNSON’S SHORT STORY “Beverly Home,” the young narrator spends his days working at a nursing home for the hopelessly disabled, where there is a particularly unfortunate patient whom no one visits:

A perpetual spasm forced him to perch sideways on his wheelchair and peer down along his nose at his knotted fingers. This condition had descended on him suddenly. He got no visitors. His wife was divorcing him. He was only thirty-three, I believe he said, but it was hard to guess what he told about himself because he really couldn’t talk anymore, beyond clamping his lips repeatedly around his protruding tongue while groaning.

No more pretending for him! He was completely and openly a mess. Meanwhile the rest of us go on trying to fool each other.

In a coast-to-coast, shag-carpeted imperial bedroom, we could all just be messes and save ourselves the trouble of pretending. But who wants to live in a pajama-party world? Privacy loses its value unless there’s something it can be defined against. “Meanwhile the rest of us go on trying to fool each other”—and a good thing, too. The need to put on a public face is as basic as the need for the privacy in which to take it off. We need both a home that’s not like a public space and a public space that’s not like home.

Walking up Third Avenue on a Saturday night, I feel bereft. All around me, attractive young people are hunched over their StarTacs and Nokias with preoccupied expressions, as if probing a sore tooth, or adjusting a hearing aid, or squeezing a pulled muscle; personal technology has begun to look like a personal handicap. All I really want from a sidewalk is that people see me and let themselves be seen, but even this modest ideal is thwarted by cell-phone users and their unwelcome privacy. They say things like “Should we have couscous with that?” and “I’m on my way to Blockbuster.” They aren’t breaking any law by broadcasting these breakfast-nook conversations. There’s no PublicityGuard that I can buy, no expensive preserve of public life to which I can flee. Seclusion, whether in a suite at the Plaza or in a cabin in the Catskills, is comparatively effortless to achieve. Privacy is protected as both commodity and right; public forums are protected as neither. Like old-growth forests, they’re few and irreplaceable and should be held in trust by everyone. The work of maintaining them gets only harder as the private sector grows ever more demanding, distracting, and disheartening. Who has the time and energy to stand up for the public sphere? What rhetoric can possibly compete with the American love of “privacy”?