The Information: A History, a Theory, a Flood


horizon, a circle, deviding the halfe of the firmament, from the other halfe which we see not

zodiack, a circle in the heaven, wherein be placed the 12 signes, and in which the Sunne is mooved

Not just the words but the knowledge was in flux. The language was examining itself. Even when Cawdrey is copying from Coote or Thomas, he is fundamentally alone, with no authority to consult.

One of Cawdrey’s hard usual words was science (“knowledge, or skill”). Science did not yet exist as an institution responsible for learning about the material universe and its laws. Natural philosophers were beginning to have a special interest in the nature of words and their meaning. They needed better than they had. When Galileo pointed his first telescope skyward and discovered sunspots in 1611, he immediately anticipated controversy— traditionally the sun was an epitome of purity—and he sensed that science could not proceed without first solving a problem of language:

So long as men were in fact obliged to call the sun “most pure and most lucid,” no shadows or impurities whatever had been perceived in it; but now that it shows itself to us as partly impure and spotty, why should we not call it “spotted and not pure”? For names and attributes must be accommodated to the essence of things, and not the essence to the names, since things come first and names afterwards.

When Isaac Newton embarked on his great program, he encountered a fundamental lack of definition where it was most needed. He began with a semantic sleight of hand: “I do not define time, space, place, and motion, as being well known to all,” he wrote deceptively. Defining these words was his very purpose. There were no agreed standards for weights and measures. Weight and measure were themselves vague terms. Latin seemed more reliable than English, precisely because it was less worn by everyday use, but the Romans had not possessed the necessary words either. Newton’s raw notes reveal a struggle hidden in the finished product. He tried expressions like quantitas materiae. Too hard for Cawdrey: “materiall, of some matter, or importance.” Newton suggested (to himself) “that which arises from its density and bulk conjointly.” He considered more words: “This quantity I designate under the name of body or mass.” Without the right words he could not proceed. Velocity, force, gravity—none of these were yet suitable. They could not be defined in terms of one another; there was nothing in visible nature at which anyone could point a finger; and there was no book in which to look them up.

As for Robert Cawdrey, his mark on history ends with the publication of his Table Alphabeticall in 1604. No one knows when he died. No one knows how many copies the printer made. There are no records (“records, writings layde up for remembrance”). A single copy made its way to the Bodleian Library in Oxford, which has preserved it. All the others disappeared. A second edition appeared in 1609, slightly expanded (“much inlarged,” the title page claims falsely) by Cawdrey’s son, Thomas, and a third and fourth appeared in 1613 and 1617, and there the life of this book ended.

It was overshadowed by a new dictionary, twice as comprehensive, An English Expositour: Teaching the Interpretation of the hardest Words used in our Language, with sundry Explications, Descriptions, and Discourses. Its compiler, John Bullokar, otherwise left as faint a mark on the historical record as Cawdrey did. He was a doctor of physic; he lived for some time in Chichester; his dates of birth and death are uncertain; he is said to have visited London in 1611 and there to have seen a dead crocodile; and little else is known. His Expositour appeared in 1616 and went through several editions in the succeeding decades. Then in 1656 a London barrister, Thomas Blount, published his Glossographia: or a Dictionary, Interpreting all such Hard Words of Whatsoever Language, now used in our refined English Tongue. Blount’s dictionary listed more than eleven thousand words, many of which, he recognized, were new, reaching London in the hurly-burly of trade and commerce—

coffa or cauphe, a kind of drink among the Turks and Persians, (and of late introduced among us) which is black, thick and bitter, destrained from Berries of that nature, and name, thought good and very wholesom: they say it expels melancholy.

—or home-grown, such as “tom-boy, a girle or wench that leaps up and down like a boy.” He seems to have known he was aiming at a moving target. The dictionary maker’s “labor,” he wrote in his preface, “would find no end, since our English tongue daily changes habit.” Blount’s definitions were much more elaborate than Cawdrey’s, and he tried to provide information about the origins of words as well.

Neither Bullokar nor Blount so much as mentioned Cawdrey. He was already forgotten. But in 1933, upon the publication of the greatest word book of all, the first editors of the Oxford English Dictionary did pay their respects to his “slim, small volume.” They called it “the original acorn” from which their oak had grown. (Cawdrey: “akecorne, k fruit.”)

Four hundred and two years after the Table Alphabeticall, the International Astronomical Union voted to declare Pluto a nonplanet, and John Simpson had to make a quick decision. He and his band of lexicographers in Oxford were working on the P’s. Pletzel, plish, pod person, point-and-shoot, and polyamorous were among the new words entering the OED. The entry for Pluto was itself relatively new. The planet had been discovered only in 1930, too late for the OED’s first edition. The name Minerva was first proposed and then rejected because there was already an asteroid Minerva. In terms of names, the heavens were beginning to fill up. Then “Pluto” was suggested by Venetia Burney, an eleven-year-old resident of Oxford. The OED caught up by adding an entry for Pluto in its second edition: “1. A small planet of the solar system lying beyond the orbit of Neptune . . . 2. The name of a cartoon dog that made its first appearance in Walt Disney’s Moose Hunt, released in April 1931.”

“We really don’t like being pushed into megachanges,” Simpson said, but he had little choice. The Disney meaning of Pluto had proved more stable than the astronomical sense, which was downgraded to “small planetary body.” Consequences rippled through the OED. Pluto was removed from the list under planet n. 3a. Plutonian was revised (not to be confused with pluton, plutey, or plutonyl ).

Simpson was the sixth in a distinguished line, the editors of the Oxford English Dictionary, whose names rolled fluently off his tongue— “Murray, Bradley, Craigie, Onions, Burchfield, so however many fingers that is”—and saw himself as a steward of their traditions, as well as traditions of English lexicography extending back to Cawdrey by way of Samuel Johnson. James Murray in the nineteenth century established a working method based on index cards, slips of paper 6 inches by 4 inches. At any given moment a thousand such slips sat on Simpson’s desk, and within a stone’s throw were millions more, filling metal files and wooden boxes with the ink of two centuries. But the word-slips had gone obsolete. They had become treeware. Treeware had just entered the OED as “computing slang, freq. humorous”; blog was recognized in 2003, dot-commer in 2004, cyberpet in 2005, and the verb to Google in 2006. Simpson himself Googled often. Beside the word-slips his desk held conduits into the nervous system of the language: instantaneous connection to a worldwide network of proxy amateur lexicographers and access to a vast, interlocking set of databases growing asymptotically toward the ideal of All Previous Text. The dictionary had met cyberspace, and neither would be the same thereafter. However much Simpson loved the OED’s roots and legacy, he was leading a revolution, willy-nilly—in what it was, what it knew, what it saw. Where Cawdrey had been isolated, Simpson was connected.

The English language, spoken now by more than a billion people globally, has entered a period of ferment, and the perspective available in these venerable Oxford offices is both intimate and sweeping. The language upon which the lexicographers eavesdrop has become wild and amorphous: a great, swirling, expanding cloud of messaging and speech; newspapers, magazines, pamphlets; menus and business memos; Internet news groups and chat-room conversations; television and radio broadcasts and phonograph records. By contrast, the dictionary itself has acquired the status of a monument, definitive and towering. It exerts an influence on the language it tries to observe. It wears its authoritative role reluctantly. The lexicographers may recall Ambrose Bierce’s sardonic century-old definition: “dictionary, a malevolent literary device for cramping the growth of a language and making it hard and inelastic.” Nowadays they stress that they do not presume (or deign) to disapprove any particular usage or spelling. But they cannot disavow a strong ambition: the goal of completeness. They want every word, all the lingo: idioms and euphemisms, sacred or profane, dead or alive, the King’s English or the street’s. It is an ideal only: the constraints of space and time are ever present and, at the margins, the question of what qualifies as a word can become impossible to answer. Still, to the extent possible, the OED is meant to be a perfect record, perfect mirror of the language.

The dictionary ratifies the persistence of the word. It declares that the meanings of words come from other words. It implies that all words, taken together, form an interlocking structure: interlocking, because all words are defined in terms of other words. This could never have been an issue in an oral culture, where language was barely visible. Only when printing—and the dictionary—put the language into separate relief, as an object to be scrutinized, could anyone develop a sense of word meaning as interdependent and even circular. Words had to be considered as words, representing other words, apart from things. In the twentieth century, when the technologies of logic advanced to high levels, the potential for circularity became a problem. “In giving explanations I already have to use language full blown,” complained Ludwig Wittgenstein. He echoed Newton’s frustration three centuries earlier, but with an extra twist, because where Newton wanted words for nature’s laws, Wittgenstein wanted words for words: “When I talk about language (words, sentences, etc.) I must speak the language of every day. Is this language somehow too coarse and material for what we want to say?” Yes. And the language was always in flux.

James Murray was speaking of the language as well as the book when he said, in 1900, “The English Dictionary, like the English Constitution, is the creation of no one man, and of no one age; it is a growth that has slowly developed itself adown the ages.” The first edition of what became the OED was one of the largest books that had ever been made: A New English Dictionary on Historical Principles, 414,825 words in ten weighty volumes, presented to King George V and President Calvin Coolidge in 1928. The work had taken decades; Murray himself was dead; and the dictionary was understood to be out of date even as the volumes were bound and sewn. Several supplements followed, but not till 1989 did the second edition appear: twenty volumes, totaling 22,000 pages. It weighed 138 pounds. The third edition is different. It is weightless, taking its shape in the digital realm. It may never again involve paper and ink. Beginning in the year 2000, a revision of the entire contents began to appear online in quarterly installments, each comprising several thousand revised entries and hundreds of new words.

Cawdrey had begun work naturally enough with the letter A, and so had James Murray in 1879, but Simpson chose to begin with M. He was wary of the A’s. To insiders it had long been clear that the OED as printed was not a seamless masterpiece. The early letters still bore scars of the immaturity of the uncertain work in Murray’s first days. “Basically he got here, sorted his suitcases out and started setting up text,” Simpson said. “It just took them a long time to sort out their policy and things, so if we started at A, then we’d be making our job doubly difficult. I think they’d sorted themselves out by . . . well, I was going to say D, but Murray always said that E was the worst letter, because his assistant, Henry Bradley, started E, and Murray always said that he did that rather badly. So then we thought, maybe it’s safe to start with G, H. But you get to G and H and there’s I, J, K, and you know, you think, well, start after that.”

The first thousand entries from M to mahurat went online in the spring of 2000. A year later, the lexicographers reached words starting with me: me-ism (a creed for modern times), meds (colloq. for drugs), medspeak (doctors’ jargon), meet-and-greet (a N. Amer. type of social occasion), and an assortment of combined forms under media (baron, circus, darling, hype, savvy) and mega- (pixel, bitch, dose, hit, trend). This was no longer a language spoken by 5 million mostly illiterate inhabitants of a small island. As the OED revised the entries letter by letter, it also began adding neologisms wherever they arose; waiting for the alphabetical sequence became impractical. Thus one installment in 2001 saw the arrival of acid jazz, Bollywood, channel surfing, double-click, emoticon, feel-good, gangsta, hyperlink, and many more. Kool-Aid was recognized as a new word, not because the OED feels obliged to list proprietary names (the original Kool-Ade powdered drink had been patented in the United States in 1927) but because a special usage could no longer be ignored: “to drink the Kool-Aid: to demonstrate unquestioning obedience or loyalty.” The growth of this peculiar expression since the use of a powdered beverage in a mass poisoning in Guyana in 1978 bespoke a certain density of global communication.

But they were no slaves to fashion, these Oxford lexicographers. As a rule a neologism needs five years of solid evidence for admission to the canon. Every proposed word undergoes intense scrutiny. The approval of a new word is a solemn matter. It must be in general use, beyond any particular place of origin; the OED is global, recognizing words from everywhere English is spoken, but it does not want to capture local quirks. Once added, a word cannot come out. A word can go obsolete or rare, but the most ancient and forgotten words have a way of reappearing—rediscovered or spontaneously reinvented—and in any case they are part of the language’s history. All 2,500 of Cawdrey’s words are in the OED, perforce. For thirty-one of them Cawdrey’s little book was the first known usage. For a few Cawdrey is all alone. This is troublesome. The OED is irrevocably committed. Cawdrey, for example, has “onust, loaden, overcharged”; so the OED has “loaded, burdened,” but it is an outlier, a one-off. Did Cawdrey make it up? “I’m tending towards the view that he was attempting to reproduce vocabulary he had heard or seen,” Simpson said. “But I can’t be absolutely sure.” Cawdrey has “hallucinate, to deceive, or blind”; the OED duly gave “to deceive” as the first sense of the word, though it never found anyone else who used it that way. In cases like these, the editors can add their double caveat “Obs. rare.” But there it is.

For the twenty-first-century OED a single source is never enough. Strangely, considering the vastness of the enterprise and its constituency, individual men and women strive to have their own nonce-words ratified by the OED. Nonce-word, in fact, was coined by James Murray himself. He got it in. An American psychologist, Sondra Smalley, coined the word codependency in 1979 and began lobbying for it in the eighties; the editors finally drafted an entry in the nineties, when they judged the word to have become established. W. H. Auden declared that he wanted to be recognized as an OED word coiner—and he was, at long last, for motted, metalogue, spitzy, and others. The dictionary had thus become engaged in a feedback loop. It inspired a twisty self-consciousness in the language’s users and creators. Anthony Burgess whinged in print about his inability to break through: “I invented some years ago the word amation, for the art or act of making love, and still think it useful. But I have to persuade others to use it in print before it is eligible for lexicographicizing (if that word exists)”—he knew it did not. “T. S. Eliot’s large authority got the shameful (in my view) juvescence into the previous volume of the Supplement.” Burgess was quite sure that Eliot simply misspelled juvenescence. If so, the misspelling was either copied or reprised twenty-eight years later by Stephen Spender, so juvescence has two citations, not one. The OED admits that it is rare.

As hard as the OED tries to embody the language’s fluidity, it cannot help but serve as an agent of its crystallization. The problem of spelling poses characteristic difficulties. “Every form in which a word has occurred throughout its history” is meant to be included. So for mackerel (“a well-known sea-fish, Scomber scombrus, much used for food”) the second edition in 1989 listed nineteen alternative spellings. The unearthing of sources never ends, though, so the third edition’s revised entry in 2002 listed no fewer than thirty: maccarel, mackaral, mackarel, mackarell, mackerell, mackeril, mackreel, mackrel, mackrell, mackril, macquerel, macquerell, macrel, macrell, macrelle, macril, macrill, makarell, makcaral, makerel, makerell, makerelle, makral, makrall, makreill, makrel, makrell, makyrelle, maquerel, and maycril. As lexicographers, the editors would never declare these alternatives to be wrong: misspellings. They do not wish to declare their choice of spelling for the headword, mackerel, to be “correct.” They emphasize that they examine the evidence and choose “the most common current spelling.” Even so, arbitrary considerations come into play: “Oxford’s house style occasionally takes precedence, as with verbs which can end -ize or -ise, where the -ize spelling is always used.” They know that no matter how often and how firmly they disclaim a prescriptive authority, a reader will turn to the dictionary to find out how a word should be spelled. They cannot escape inconsistencies. They feel obliged to include words that make purists wince. A new entry as of December 2003 memorialized nucular: “= nuclear a. (in various senses).” Yet they refuse to count evident misprints found by way of Internet searches. They do not recognize straight-laced, even though statistical evidence finds that bastardized form outnumbering strait-laced.
For the crystallization of spelling, the OED offers a conventional explanation: “Since the invention of the printing press, spelling has become much less variable, partly because printers wanted uniformity and partly because of a growing interest in language study during the Renaissance.” This is true. But it omits the role of the dictionary itself, arbitrator and exemplar.

For Cawdrey the dictionary was a snapshot; he could not see past his moment in time. Samuel Johnson was more explicitly aware of the dictionary’s historical dimension. He justified his ambitious program in part as a means of bringing a wild thing under control—the wild thing being the language, “which, while it was employed in the cultivation of every species of literature, has itself been hitherto neglected; suffered to spread, under the direction of chance, into wild exuberance; resigned to the tyranny of time and fashion; and exposed to the corruptions of ignorance, and caprices of innovation.” Not until the OED, though, did lexicography attempt to reveal the whole shape of a language across time. The OED becomes a historical panorama. The project gains poignancy if the electronic age is seen as a new age of orality, the word breaking free from the bonds of cold print. No publishing institution better embodies those bonds, but the OED, too, tries to throw them off. The editors feel they can no longer wait for a new word to appear in print, let alone in a respectably bound book, before they must take note. For tighty-whities (men’s underwear), new in 2007, they cite a typescript of North Carolina campus slang. For kitesurfer, they cite a posting to the Usenet newsgroup alt.kite and later a New Zealand newspaper found via an online database. Bits in the ether.

When Murray began work on the new dictionary, the idea was to find the words, and with them the signposts to their history. No one had any idea how many words were there to be found. By then the best and most comprehensive dictionary of English was American: Noah Webster’s, seventy thousand words. That was a baseline. Where were the rest to be discovered? For the first editors of what became the OED, it went almost without saying that the source, the wellspring, should be the literature of the language—particularly the books of distinction and quality. The dictionary’s first readers combed Milton and Shakespeare (still the single most quoted author, with more than thirty thousand references), Fielding and Swift, histories and sermons, philosophers and poets. Murray announced in a famous public appeal in 1879:

A thousand readers are wanted. The later sixteenth-century literature is very fairly done; yet here several books remain to be read. The seventeenth century, with so many more writers, naturally shows still more unexplored territory.

He considered the territory to be large but bounded. The founders of the dictionary explicitly meant to find every word, however many that would ultimately be. They planned a complete inventory. Why should they not? The number of books was unknown but not unlimited, and the number of words in those books was countable. The task seemed formidable but finite.

It no longer seems finite. Lexicographers are accepting the language’s boundlessness. They know by heart Murray’s famous remark: “The circle of the English language has a well-defined centre but no discernable circumference.” In the center are the words everyone knows. At the edges, where Murray placed slang and cant and scientific jargon and foreign border crossers, everyone’s sense of the language differs and no one’s can be called “standard.”

Murray called the center “well defined,” but infinitude and fuzziness can be seen there. The easiest, most common words—the words Cawdrey had no thought of including—require, in the OED, the most extensive entries. The entry for make alone would fill a book: it teases apart ninety-eight distinct senses of the verb, and some of these senses have a dozen or more subsenses. Samuel Johnson saw the problem with these words and settled on a solution: he threw up his hands.

My labor has likewise been much increased by a class of verbs too frequent in the English language, of which the signification is so loose and general, the use so vague and indeterminate, and the senses detorted so widely from the first idea, that it is hard to trace them through the maze of variation, to catch them on the brink of utter inanity, to circumscribe them by any limitations, or interpret them by any words of distinct and settled meaning; such are bear, break, come, cast, full, get, give, do, put, set, go, run, make, take, turn, throw. If of these the whole power is not accurately delivered, it must be remembered, that while our language is yet living, and variable by the caprice of every one that speaks it, these words are hourly shifting their relations, and can no more be ascertained in a dictionary, than a grove, in the agitation of a storm, can be accurately delineated from its picture in the water.

Johnson had a point. These are words that any speaker of English can press into new service at any time, on any occasion, alone or in combination, inventively or not, with hopes of being understood. In every revision, the OED’s entry for a word like make subdivides further and thus grows larger. The task is unbounded in an inward-facing direction.

The more obvious kind of unboundedness appears at the edges. Neologism never ceases. Words are coined by committee: transistor, Bell Laboratories, 1948. Or by wags: booboisie, H. L. Mencken, 1922. Most arise through spontaneous generation, organisms appearing in a petri dish, like blog (c. 1999). One batch of arrivals includes agroterrorism, bada-bing, bahookie (a body part), beer pong (a drinking game), bippy (as in, you bet your ———), chucklesome, cypherpunk, tuneage, and wonky. None are what Cawdrey would have seen as “hard, usual words,” and none are anywhere near Murray’s well-defined center, but they now belong to the common language. Even bada-bing: “Suggesting something happening suddenly, emphatically, or easily and predictably; ‘Just like that!’, ‘Presto!’ ” The historical citations begin with a 1965 audio recording of a comedy routine by Pat Cooper and continue with newspaper clippings, a television news transcript, and a line of dialogue from the first Godfather movie: “You’ve gotta get up close like this and bada-bing! you blow their brains all over your nice Ivy League suit.” The lexicographers also provide an etymology, an exquisite piece of guesswork: “Origin uncertain. Perh. imitative of the sound of a drum roll and cymbal clash. Perh. cf. Italian bada bene mark well.”

The English language no longer has such a thing as a geographic center, if it ever did. The universe of human discourse always has backwaters. The language spoken in one valley diverges from the language of the next valley, and so on. There are more valleys now than ever, even if the valleys are not so isolated. “We are listening to the language,” said Peter Gilliver, an OED lexicographer and resident historian. “When you are listening to the language by collecting pieces of paper, that’s fine, but now it’s as if we can hear everything said anywhere. Take an expatriate community living in a non-English-speaking part of the world, expatriates who live at Buenos Aires or something. Their English, the English that they speak to one another every day, is full of borrowings from local Spanish. And so they would regard those words as part of their idiolect, their personal vocabulary.” Only now they may also speak in chat rooms and on blogs. When they coin a word, anyone may hear. Then it may or may not become part of the language.

If there is an ultimate limit to the sensitivity of lexicographers’ ears, no one has yet found it. Spontaneous coinages can have an audience of one. They can be as ephemeral as atomic particles in a bubble chamber. But many neologisms require a level of shared cultural knowledge. Perhaps bada-bing would not truly have become part of twenty-first-century English had it not been for the common experience of viewers of a particular American television program (though it is not cited by the OED).

The whole word hoard—the lexis—constitutes a symbol set of the language. It is the fundamental symbol set, in one way: words are the first units of meaning any language recognizes. They are recognized universally. But in another way it is far from fundamental: as communication evolves, messages in a language can be broken down and composed and transmitted in much smaller sets of symbols: the alphabet; dots and dashes; drumbeats high and low. These symbol sets are discrete. The lexis is not. It is messier. It keeps on growing. Lexicography turns out to be a science poorly suited to exact measurement. English, the largest and most widely shared language, can be said very roughly to possess a number of units of meaning that approaches a million. Linguists have no special yardsticks of their own; when they try to quantify the pace of neologism, they tend to look to the dictionary for guidance, and even the best dictionary runs from that responsibility. The edges always blur. A clear line cannot be drawn between word and unword.

So we count as we can. Robert Cawdrey’s little book, making no pretense to completeness, contained a vocabulary of only 2,500. We possess now a more complete dictionary of English as it was circa 1600: the subset of the OED comprising words then current. That vocabulary numbers 60,000 and keeps growing, because the discovery of sixteenth-century sources never ends. Even so, it is a tiny fraction of the words used four centuries later. The explanation for this explosive growth, from 60,000 to a million, is not simple. Much of what now needs naming did not yet exist, of course. And much of what existed was not recognized. There was no call for transistor in 1600, nor nanobacterium, nor webcam, nor fen-phen. Some of the growth comes from mitosis. The guitar divides into the electric and the acoustic; other words divide in reflection of delicate nuances (as of March 2007 the OED assigned a new entry to prevert as a form of pervert, taking the view that prevert was not just an error but a deliberately humorous effect). Other new words appear without any corresponding innovation in the world of real things. They crystallize in the solvent of universal information.

What, in the world, is a mondegreen? It is a misheard lyric, as when, for example, the Christian hymn “Lead on, O King Eternal” is heard as “Lead on, O kinky turtle.” In sifting the evidence, the OED first cites a 1954 essay in Harper’s Magazine by Sylvia Wright: “What I shall hereafter call mondegreens, since no one else has thought up a word for them.” She explained the idea and the word this way:

When I was a child, my mother used to read aloud to me from Percy’s Reliques, and one of my favorite poems began, as I remember:

Ye Highlands and ye Lowlands,

Oh, where hae ye been?

They hae slain the Earl Amurray,

And Lady Mondegreen.

There the word lay, for some time. A quarter-century later, William Safire discussed the word in a column about language in The New York Times Magazine. Fifteen years after that, Steven Pinker, in his book The Language Instinct, offered a brace of examples, from “A girl with colitis goes by” to “Gladly the cross-eyed bear,” and observed, “The interesting thing about mondegreens is that the mishearings are generally less plausible than the intended lyrics.” But it was not books or magazines that gave the word its life; it was Internet sites, compiling mondegreens by the thousands. The OED recognized the word in June 2004.

A mondegreen is not a transistor, inherently modern. Its modernity is harder to explain. The ingredients—songs, words, and imperfect understanding—are all as old as civilization. Yet for mondegreens to arise in the culture, and for mondegreen to exist in the lexis, required something new: a modern level of linguistic self-consciousness and interconnectedness. People needed to mishear lyrics not just once, not just several times, but often enough to become aware of the mishearing as a thing worth discussing. They needed to have other such people with whom to share the recognition. Until the most modern times, mondegreens, like countless other cultural or psychological phenomena, simply did not need to be named. Songs themselves were not so common; not heard, anyway, on elevators and mobile phones. The word lyrics, meaning the words of a song, did not exist until the nineteenth century. The conditions for mondegreens took a long time to ripen. Similarly, the verb to gaslight now means “to manipulate a person by psychological means into questioning his or her own sanity”; it exists only because enough people saw the 1944 film of that title and could assume that their listeners had seen it, too. Might not the language Cawdrey spoke—which was, after all, the abounding and fertile language of Shakespeare—have found use for such a word? No matter: the technology for gaslight had not been invented. Nor had the technology for motion pictures.

The lexis is a measure of shared experience, which comes from interconnectedness. The number of users of the language forms only the first part of the equation: jumping in four centuries from 5 million English speakers to a billion. The driving factor is the number of connections between and among those speakers. A mathematician might say that messaging grows not geometrically, but combinatorially, which is much, much faster. “I think of it as a saucepan under which the temperature has been turned up,” Gilliver said. “Any word, because of the interconnectedness of the English-speaking world, can spring from the backwater. And they are still backwaters, but they have this instant connection to ordinary, everyday discourse.” Like the printing press, the telegraph, and the telephone before it, the Internet is transforming the language simply by transmitting information differently. What makes cyberspace different from all previous information technologies is its intermixing of scales from the largest to the smallest without prejudice, broadcasting to the millions, narrowcasting to groups, instant messaging one to one.

This comes as quite an unexpected consequence of the invention of computing machinery. At first, that had seemed to be about numbers.

Chapter Four

To Throw the Powers of Thought into Wheel-Work

(Lo, the Raptured Arithmetician)

Light almost solar has been extracted from the refuse of fish; fire has been sifted by the lamp of Davy; and machinery has been taught arithmetic instead of poetry.

—Charles Babbage (1832)

NO ONE DOUBTED THAT Charles Babbage was brilliant. Nor did anyone quite understand the nature of his genius, which remained out of focus for a long time. What did he hope to achieve? For that matter, what, exactly, was his vocation? On his death in London in 1871 the Times obituarist declared him “one of the most active and original of original thinkers” but seemed to feel he was best known for his long, cranky crusade against street musicians and organ-grinders. He might not have minded. He was multifarious and took pride in it. “He showed great desire to inquire into the causes of things that astonish childish minds,” said an American eulogist. “He eviscerated toys to ascertain their manner of working.” Babbage did not quite belong in his time, which called itself the Steam Age or the Machine Age. He did revel in the uses of steam and machinery and considered himself a thoroughly modern man, but he also pursued an assortment of hobbies and obsessions—cipher cracking, lock picking, lighthouses, tree rings, the post—whose logic became clearer a century later. Examining the economics of the mail, he pursued a counterintuitive insight, that the significant cost comes not from the physical transport of paper packets but from their “verification”—the calculation of distances and the collection of correct fees—and thus he invented the modern idea of standardized postal rates. He loved boating, by which he meant not “the manual labor of rowing but the more intellectual art of sailing.” He was a train buff. He devised a railroad recording device that used inking pens to trace curves on sheets of paper a thousand feet long: a combination seismograph and speedometer, inscribing the history of a train’s velocity and all the bumps and shakes along the way.

As a young man, stopping at an inn in the north of England, he was amused to hear that his fellow travelers had been debating his trade:

“The tall gentleman in the corner,” said my informant, “maintained you were in the hardware line; whilst the fat gentleman who sat next to you at supper was quite sure that you were in the spirit trade. Another of the party declared that they were both mistaken: he said you were travelling for a great iron-master.”

“Well,” said I, “you, I presume, knew my vocation better than our friends.”

“Yes,” said my informant, “I knew perfectly well that you were in the Nottingham lace trade.”

He might have been described as a professional mathematician, yet here he was touring the country’s workshops and manufactories, trying to discover the state of the art in machine tools. He noted, “Those who enjoy leisure can scarcely find a more interesting and instructive pursuit than the examination of the workshops of their own country, which contain within them a rich mine of knowledge, too generally neglected by the wealthier classes.” He himself neglected no vein of knowledge. He did become expert on the manufacture of Nottingham lace; also the use of gunpowder in quarrying limestone; precision glass cutting with diamonds; and all known uses of machinery to produce power, save time, and communicate signals. He analyzed hydraulic presses, air pumps, gas meters, and screw cutters. By the end of his tour he knew as much as anyone in England about the making of pins. His knowledge was practical and methodical. He estimated that a pound of pins required the work of ten men and women for at least seven and a half hours, drawing wire, straightening wire, pointing the wire, twisting and cutting heads from the spiral coils, tinning or whitening, and finally papering. He computed the cost of each phase in millionths of a penny. And he noted that this process, when finally perfected, had reached its last days: an American had invented an automatic machine to accomplish the same task, faster.

Babbage invented his own machine, a great, gleaming engine of brass and pewter, comprising thousands of cranks and rotors, cogs and gearwheels, all tooled with the utmost precision. He spent his long life improving it, first in one and then in another incarnation, but all, mainly, in his mind. It never came to fruition anywhere else. It thus occupies an extreme and peculiar place in the annals of invention: a failure, and also one of humanity’s grandest intellectual achievements. It failed on a colossal scale, as a scientific-industrial project “at the expense of the nation, to be held as national property,” financed by the Treasury for almost twenty years, beginning in 1823 with a Parliamentary appropriation of £1,500 and ending in 1842, when the prime minister shut it down. Later, Babbage’s engine was forgotten. It vanished from the lineage of invention. Later still, however, it was rediscovered, and it became influential in retrospect, to shine as a beacon from the past.

Like the looms, forges, naileries, and glassworks he studied in his travels across northern England, Babbage’s machine was designed to manufacture vast quantities of a certain commodity. The commodity was numbers. The engine opened a channel from the corporeal world of matter to a world of pure abstraction. The engine consumed no raw materials—input and output being weightless—but needed a considerable force to turn the gears. All that wheel-work would fill a room and weigh several tons. Producing numbers, as Babbage conceived it, required a degree of mechanical complexity at the very limit of available technology. Pins were easy, compared with numbers.

It was not natural to think of numbers as a manufactured commodity. They existed in the mind, or in ideal abstraction, in their perfect infinitude. No machine could add to the world’s supply. The numbers produced by Babbage’s engine were meant to be those with significance: numbers with a meaning. For example, 2.096910013 has a meaning, as the logarithm of 125. (Whether every number has a meaning would be a conundrum for the next century.) The meaning of a number could be expressed as a relationship to other numbers, or as the answer to a certain question of arithmetic. Babbage himself did not speak in terms of meaning; he tried to explain his engine pragmatically, in terms of putting numbers into the machine and seeing other numbers come out, or, a bit more fancifully, in terms of posing questions to the machine and expecting an answer. Either way, he had trouble getting the point across. He grumbled:

On two occasions I have been asked,—“Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?” In one case a member of the Upper, and in the other a member of the Lower, House put this question. I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

Anyway, the machine was not meant to be a sort of oracle, to be consulted by individuals who would travel from far and wide for mathematical answers. The engine’s chief mission was to print out numbers en masse. For portability, the facts of arithmetic could be expressed in tables and bound in books.

To Babbage the world seemed made of such facts. They were the “constants of Nature and Art.” He collected them everywhere. He compiled a Table of Constants of the Class Mammalia: wherever he went he timed the breaths and heartbeats of pigs and cows. He invented a statistical methodology with tables of life expectancy for the somewhat shady business of life insurance. He drew up a table of the weight in Troy grains per square yard of various fabrics: cambric, calico, nankeen, muslins, silk gauze, and “caterpillar veils.” Another table revealed the relative frequencies of all the double-letter combinations in English, French, Italian, German, and Latin. He researched, computed, and published a Table of the Relative Frequency of the Causes of Breaking of Plate Glass Windows, distinguishing 464 different causes, no less than fourteen of which involved “drunken men, women, or boys.” But the tables closest to his heart were the purest: tables of numbers and only numbers, marching neatly across and down the pages in stately rows and columns, patterns for abstract appreciation.

A book of numbers: amid all the species of information technology, how peculiar and powerful an object this is. “Lo! the raptured arithmetician!” wrote Élie de Joncourt in 1762. “Easily satisfied, he asks no Brussels lace, nor a coach and six.” Joncourt’s own contribution was a small quarto volume registering the first 19,999 triangular numbers. It was a treasure box of exactitude, perfection, and close reckoning. These numbers were so simple, just the sums of the first n whole numbers: 1, 3 (1+2), 6 (1+2+3), 10 (1+2+3+4), 15, 21, 28, and so on. They had interested number theorists since Pythagoras. They offered little in the way of utility, but Joncourt rhapsodized about his pleasure in compiling them and Babbage quoted him with heartfelt sympathy: “Numbers have many charms, unseen by vulgar eyes, and only discovered to the unwearied and respectful sons of Art. Sweet joy may arise from such contemplations.”

Tables of numbers had been part of the book business even before the beginning of the print era. Working in Baghdad in the ninth century, Abu Abdullah Mohammad Ibn Musa al-Khwarizmi, whose name survives in the word algorithm, devised tables of trigonometric functions that spread west across Europe and east to China, made by hand and copied by hand, for hundreds of years. Printing brought number tables into their own: they were a natural first application for the mass production of data in the raw. For people in need of arithmetic, multiplication tables covered more and more territory: 10 × 1,000, then 10 × 10,000, and later as far as 1,000 × 1,000. There were tables of squares and cubes, roots and reciprocals. An early form of table was the ephemeris or almanac, listing positions of the sun, moon, and planets for sky-gazers. Tradespeople found uses for number books. In 1582 Simon Stevin produced Tafelen van Interest, a compendium of interest tables for bankers and moneylenders. He promoted the new decimal arithmetic “to astrologers, land-measurers, measurers of tapestry and wine casks and stereometricians, in general, mint masters and merchants all.” He might have added sailors. When Christopher Columbus set off for the Indies, he carried as an aid to navigation a book of tables by Regiomontanus printed in Nuremberg two decades after the invention of moveable type in Europe.

Joncourt’s book of triangular numbers was purer than any of these—which is also to say useless. Any arbitrary triangular number can be found (or made) by an algorithm: multiply n by n + 1 and divide by 2. So Joncourt’s whole compendium, as a bundle of information to be stored and transmitted, collapses in a puff to a one-line formula. The formula contains all the information. With it, anyone capable of simple multiplication (not many were) could generate any triangular number on demand. Joncourt knew this. Still he and his publisher, M. Husson, at the Hague, found it worthwhile to set the tables in metal type, three pairs of columns to a page, each pair listing thirty natural numbers alongside their corresponding triangular numbers, from 1(1) to 19,999(199,990,000), every numeral chosen individually by the compositor from his cases of metal type and lined up in a galley frame and wedged into an iron chase to be placed upon the press.
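That one-line formula really does contain the whole book. A minimal modern sketch (the function name is an invention of this illustration, not anything in Joncourt):

```python
def triangular(n: int) -> int:
    """The n-th triangular number: 1 + 2 + ... + n = n(n + 1) / 2."""
    return n * (n + 1) // 2

# The opening entries of Joncourt's table...
print([triangular(n) for n in range(1, 8)])  # [1, 3, 6, 10, 15, 21, 28]

# ...and its final entry, the pair 19,999 (199,990,000).
print(triangular(19_999))  # 199990000
```

A few characters of formula regenerate, on demand, every one of the 19,999 entries the compositor once set by hand.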

Why? Besides the obsession and the ebullience, the creators of number tables had a sense of their economic worth. Consciously or not, they reckoned the price of these special data by weighing the difficulty of computing them versus looking them up in a book. Precomputation plus data storage plus data transmission usually came out cheaper than ad hoc computation. “Computers” and “calculators” existed: they were people with special skills, and all in all, computing was costly.

Beginning in 1767, England’s Board of Longitude ordered published a yearly Nautical Almanac, with position tables for the sun, moon, stars, planets, and moons of Jupiter. Over the next half century a network of computers did the work—thirty-four men and one woman, Mary Edwards of Ludlow, Shropshire, all working from their homes. Their painstaking labor paid £70 a year. Computing was a cottage industry. Some mathematical sense was required but no particular genius; rules were laid out in steps for each type of calculation. In any case the computers, being human, made errors, so the same work was often farmed out twice for the sake of redundancy. (Unfortunately, being human, computers were sometimes caught saving themselves labor by copying from one another.) To manage the information flow the project employed a Comparer of the Ephemeris and Corrector of the Proofs. Communication between the computers and comparer went by post, men on foot or on horseback, a few days per message.

A seventeenth-century invention had catalyzed the whole enterprise. This invention was itself a species of number, given the name logarithm. It was number as tool. Henry Briggs explained:

Logarithmes are Numbers invented for the more easie working of questions in Arithmetike and Geometrie. The name is derived of Logos, which signifies Reason, and Arithmos, signifying Numbers. By them all troublesome Multiplications and Divisions in Arithmetike are avoided, and performed onely by Addition in stead of Multiplication, and by Subtraction in stead of Division.

In 1614 Briggs was a professor of geometry—the first professor of geometry—at Gresham College, London, later to be the birthplace of the Royal Society. Without logarithms he had already created two books of tables, A Table to find the Height of the Pole, the Magnetic Declination being given and Tables for the Improvement of Navigation, when a book came from Edinburgh promising to “take away all the difficultie that heretofore hath beene in mathematical calculations.”

There is nothing (right well beloved Students in the Mathematickes) that is so troublesome to Mathematicall practice, not that doth more molest and hinder Calculators, then the Multiplications, Divisions, square and cubical Extractions of great numbers, which besides the tedious expence of time, are for the most part subject to many slippery errors.

This new book proposed a method that would do away with most of the expense and the errors. It was like an electric flashlight sent to a lightless world. The author was a wealthy Scotsman, John Napier (or Napper, Nepair, Naper, or Neper), the eighth laird of Merchiston Castle, a theologian and well-known astrologer who also made a hobby of mathematics. Briggs was agog. “Naper, lord of Markinston, hath set my head and hands a work,” he wrote. “I hope to see him this summer, if it please God, for I never saw book, which pleased me better, and made me more wonder.” He made his pilgrimage to Scotland and their first meeting, as he reported later, began with a quarter hour of silence: “spent, each beholding other almost with admiration before one word was spoke.”

Briggs broke the trance: “My Lord, I have undertaken this long journey purposely to see your person, and to know by what engine of wit or ingenuity you came first to think of this most excellent help unto astronomy, viz. the Logarithms; but, my Lord, being by you found out, I wonder nobody else found it out before, when now known it is so easy.” He stayed with the laird for several weeks, studying.

In modern terms a logarithm is an exponent. A student learns that the logarithm of 100, using 10 as the base, is 2, because 100 = 10². The logarithm of 1,000,000 is 6, because 6 is the exponent in the expression 1,000,000 = 10⁶. To multiply two numbers, a calculator could just look up their logarithms and add those. For example:

100 × 1,000,000 = 10² × 10⁶ = 10²⁺⁶ = 10⁸

Looking up and adding are easier than multiplying.

But Napier did not express his idea this way, in terms of exponents. He grasped the thing viscerally: he was thinking in terms of a relationship between differences and ratios. A series of numbers with a fixed difference is an arithmetic progression: 0, 1, 2, 3, 4, 5 . . . When the numbers are separated by a fixed ratio, the progression is geometric: 1, 2, 4, 8, 16, 32 . . . Set these progressions side by side,

0 1 2 3 4 5 . . . (base 2 logarithms)

1 2 4 8 16 32 . . . (natural numbers)

and the result is a crude table of logarithms—crude, because the whole-number exponents are the easy ones. A useful table of logarithms had to fill in the gaps, with many decimal places of accuracy.

In Napier’s mind was an analogy: differences are to ratios as addition is to multiplication. His thinking crossed over from one plane to another, from spatial relationships to pure numbers. Aligning these scales side by side, he gave a calculator a practical means of converting multiplication into addition—downshifting, in effect, from the difficult task to the easier one. In a way, the method is a kind of translation, or encoding. The natural numbers are encoded as logarithms. The calculator looks them up in a table, the code book. In this new language, calculation is easy: addition instead of multiplication, or multiplication instead of exponentiation. When the work is done, the result is translated back into the language of natural numbers. Napier, of course, could not think in terms of encoding.

Briggs revised and extended the necessary number sequences and published a book of his own, Logarithmicall Arithmetike, full of pragmatic applications. Besides the logarithms he presented tables of latitude of the sun’s declination year by year; showed how to find the distance between any two places, given their latitudes and longitudes; and laid out a star guide with declinations, distance to the pole, and right ascension. Some of this represented knowledge never compiled and some was oral knowledge making the transition to print, as could be seen in the not-quite-formal names of the stars: the Pole Starre, girdle of Andromeda, Whales Bellie, the brightest in the harpe, and the first in the great Beares taile next her rump. Briggs also considered matters of finance, offering rules for computing with interest, backward and forward in time. The new technology was a watershed: “It may be here also noted that the use of a 100 pound for a day at the rate of 8, 9, 10, or the like for a yeare hath beene scarcely known, till by Logarithms it was found out: for otherwise it requires so many laborious extractions of roots, as will cost more paines than the knowledge of the thing is accompted to be worth.” Knowledge has a value and a discovery cost, each to be counted and weighed.

Even this exciting discovery took several years to travel as far as Johannes Kepler, who employed it in perfecting his celestial tables in 1627, based on the laboriously acquired data of Tycho Brahe. “A Scottish baron has appeared on the scene (his name I have forgotten) who has done an excellent thing,” Kepler wrote a friend, “transforming all multiplication and division into addition and subtraction.” Kepler’s tables were far more accurate—perhaps thirty times more—than any of his medieval predecessors, and the accuracy made possible an entirely new thing, his harmonious heliocentric system, with planets orbiting the sun in ellipses. From that time until the arrival of electronic machines, the majority of human computation was performed by means of logarithms. A teacher of Kepler’s sniffed, “It is not fitting for a professor of mathematics to manifest childish joy just because reckoning is made easier.” But why not? Across the centuries they all felt that joy in reckoning: Napier and Briggs, Kepler and Babbage, making their lists, building their towers of ratio and proportion, perfecting their mechanisms for transforming numbers into numbers. And then the world’s commerce validated their pleasure.

Charles Babbage was born on Boxing Day 1791, near the end of the century that began with Newton. His home was on the south side of the River Thames in Walworth, Surrey, still a rural hamlet, though the London Bridge was scarcely a half hour’s walk even for a small boy. He was the son of a banker, who was himself the son and grandson of goldsmiths. In the London of Babbage’s childhood, the Machine Age made itself felt everywhere. A new breed of impresario was showing off machinery in exhibitions. The shows that drew the biggest crowds featured automata—mechanical dolls, ingenious and delicate, with wheels and pinions mimicking life itself. Charles Babbage went with his mother to John Merlin’s Mechanical Museum in Hanover Square, full of clockwork and music boxes and, most interesting, simulacra of living things. A metal swan bent its neck to catch a metal fish, moved by hidden motors and cams. In the artist’s attic workshop Charles saw a pair of naked dancing women, gliding and bowing, crafted in silver at one-fifth life size. Merlin himself, their elderly creator, said he had devoted years to these machines, his favorites, still unfinished. One of the figurines especially impressed Charles with its (or her) grace and seeming liveliness. “This lady attitudinized in a most fascinating manner,” he recalled. “Her eyes were full of imagination, and irresistible.” Indeed, when he was a man in his forties he found Merlin’s silver dancer at an auction, bought it for £35, installed it on a pedestal in his home, and dressed its nude form in custom finery.

The boy also loved mathematics—an interest far removed from the mechanical arts, as it seemed. He taught himself in bits and pieces from such books as he could find. In 1810 he entered Trinity College, Cambridge—Isaac Newton’s domain and still the moral center of mathematics in England. Babbage was immediately disappointed: he discovered that he already knew more of the modern subject than his tutors, and the further knowledge he sought was not to be found there, maybe not anywhere in England. He began to acquire foreign books—especially books from Napoleon’s France, with which England was at war. From a specialty bookseller in London he got Lagrange’s Théorie des fonctions analytiques and “the great work of Lacroix, on the Differential and Integral Calculus.”

He was right: at Cambridge mathematics was stagnating. A century earlier Newton had been only the second professor of mathematics the university ever had; all the subject’s power and prestige came from his legacy. Now his great shadow lay across English mathematics as a curse. The most advanced students learned his brilliant and esoteric “fluxions” and the geometrical proofs of his Principia. In the hands of anyone but Newton, the old methods of geometry brought little but frustration. His peculiar formulations of the calculus did his heirs little good. They were increasingly isolated. The English professoriate “regarded any attempt at innovation as a sin against the memory of Newton,” one nineteenth-century mathematician said. For the running river of modern mathematics a student had to look elsewhere, to the Continent, to “analysis” and the language of differentiation as invented by Newton’s rival and nemesis, Gottfried Wilhelm Leibniz. Fundamentally, there was only one calculus. Newton and Leibniz knew how similar their work was—enough that each accused the other of plagiarism. But they had devised incompatible systems of notation—different languages—and in practice these surface differences mattered more than the underlying sameness. Symbols and operators were what a mathematician had to work with, after all. Babbage, unlike most students, made himself fluent in both—“the dots of Newton, the d’s of Leibnitz”—and felt he had seen the light. “It is always difficult to think and reason in a new language.”

Indeed, language itself struck him as a fit subject for philosophical study—a subject into which he found himself sidetracked from time to time. Thinking about language, while thinking in language, leads to puzzles and paradoxes. Babbage tried for a while to invent, or construct, a universal language, a symbol system that would be free of local idiosyncrasies and imperfections. He was not the first to try. Leibniz himself had claimed to be on the verge of a characteristica universalis that would give humanity “a new kind of an instrument increasing the powers of reason far more than any optical instrument has ever aided the power of vision.” As philosophers came face to face with the multiplicity of the world’s dialects, they so often saw language not as a perfect vessel for truth but as a leaky sieve. Confusion about the meanings of words led to contradictions. Ambiguities and false metaphors were surely not inherent in the nature of things, but arose from a poor choice of signs. If only one could find a proper mental technology, a true philosophical language! Its symbols, properly chosen, must be universal, transparent, and immutable, Babbage argued. Working systematically, he managed to create a grammar and began to write down a lexicon but ran aground on a problem of storage and retrieval—stopped “by the apparent impossibility of arranging signs in any consecutive order, so as to find, as in a dictionary, the meaning of each when wanted.” Nevertheless he felt that language was a thing a person could invent. Ideally, language should be rationalized, made predictable and mechanical. The gears should mesh.

Still an undergraduate, he aimed at a new revival of English mathematics—a suitable cause for founding an advocacy group and launching a crusade. He joined with two other promising students, John Herschel and George Peacock, to form what they named the Analytical Society, “for the propagation of d’s” and against “the heresy of dots,” or as Babbage said, “the Dot-age of the University.” (He was pleased with his own “wicked pun.”) In their campaign to free the calculus from English dotage, Babbage lamented “the cloud of dispute and national acrimony, which has been thrown over its origin.” Never mind if it seemed French. He declared, “We have now to re-import the exotic, with nearly a century of foreign improvement, and to render it once more indigenous among us.” They were rebels against Newton in the heart of Newton-land. They met over breakfast every Sunday after chapel.

“Of course we were much ridiculed by the Dons,” Babbage recalled. “It was darkly hinted that we were young infidels, and that no good would come of us.” Yet their evangelism worked: the new methods spread from the bottom up, students learning faster than their teachers. “The brows of many a Cambridge moderator were elevated, half in ire, half in admiration, at the unusual answers which began to appear in examination papers,” wrote Herschel. The dots of Newton faded from the scene, his fluxions replaced by the notation and language of Leibniz.

Meanwhile Babbage never lacked companions with whom he could quaff wine or play whist for six-penny points. With one set of friends he formed a Ghost Club, dedicated to collecting evidence for and against occult spirits. With another set he founded a club called the Extractors, meant to sort out issues of sanity and insanity according to a set of procedures:

1 Every member shall communicate his address to the Secretary once in six months.

2 If this communication is delayed beyond twelve months, it shall be taken for granted that his relatives had shut him up as insane.

3 Every effort legal and illegal shall be made to get him out of the madhouse [hence the name “Extractors”].

4 Every candidate for admission as a member shall produce six certificates. Three that he is sane and three others that he is insane.

But the Analytical Society was serious. It was with no irony, all earnestness, that these mathematical friends, Babbage and Herschel and Peacock, resolved to “do their best to leave the world a wiser place than they found it.” They rented rooms and read papers to one another and published their “Transactions.” And in those rooms, as Babbage nodded over a book of logarithms, one of them interrupted: “Well, Babbage, what are you dreaming about?”

“I am thinking that all these Tables might be calculated by machinery,” he replied.