For centuries, historians and philosophers have traced, and debated, technology’s role in shaping civilization. Some have made the case for what the sociologist Thorstein Veblen dubbed “technological determinism”: they’ve argued that technological progress, which they see as an autonomous force outside man’s control, has been the primary factor influencing the course of human history. Karl Marx gave voice to this view when he wrote, “The windmill gives you society with the feudal lord; the steam-mill, society with the industrial capitalist.”[9]
Ralph Waldo Emerson put it more crisply: “Things are in the saddle / And ride mankind.”[10]
In the most extreme expression of the determinist view, human beings become little more than “the sex organs of the machine world,” as McLuhan memorably wrote in the “Gadget Lover” chapter of Understanding Media.[11]
Our essential role is to produce ever more sophisticated tools—to “fecundate” machines as bees fecundate plants—until technology has developed the capacity to reproduce itself on its own. At that point, we become dispensable.
At the other end of the spectrum are the instrumentalists—the people who, like David Sarnoff, downplay the power of technology, believing tools to be neutral artifacts, entirely subservient to the conscious wishes of their users. Our instruments are the means we use to achieve our ends; they have no ends of their own. Instrumentalism is the most widely held view of technology, not least because it’s the view we would prefer to be true. The idea that we’re somehow controlled by our tools is anathema to most people. “Technology is technology,” declared the media critic James Carey; “it is a means for communication and transportation over space, and nothing more.”[12]
The debate between determinists and instrumentalists is an illuminating one. Both sides command strong arguments. If you look at a particular technology at a particular point in time, it certainly appears that, as the instrumentalists claim, our tools are firmly under our control. Every day, each of us makes conscious decisions about which tools we use and how we use them. Societies, too, make deliberate choices about how they deploy different technologies. The Japanese, looking to preserve the traditional samurai culture, effectively banned the use of firearms in their country for two centuries. Some religious communities, such as the Old Order Amish fellowships in North America, shun motor cars and other modern technologies. All countries put legal or other restrictions on the use of certain tools.
But if you take a broader historical or social view, the claims of the determinists gain credibility. Although individuals and communities may make very different decisions about which tools they use, that doesn’t mean that as a species we’ve had much control over the path or pace of technological progress. It strains belief to argue that we “chose” to use maps and clocks (as if we might have chosen not to). It’s even harder to accept that we “chose” the myriad side effects of those technologies, many of which, as we’ve seen, were entirely unanticipated when the technologies came into use. “If the experience of modern society shows us anything,” observes the political scientist Langdon Winner, “it is that technologies are not merely aids to human activity, but also powerful forces acting to reshape that activity and its meaning.”[13]
Though we’re rarely conscious of the fact, many of the routines of our lives follow paths laid down by technologies that came into use long before we were born. It’s an overstatement to say that technology progresses autonomously—our adoption and use of tools are heavily influenced by economic, political, and demographic considerations—but it isn’t an overstatement to say that progress has its own logic, which is not always consistent with the intentions or wishes of the toolmakers and tool users. Sometimes our tools do what we tell them to. Other times, we adapt ourselves to our tools’ requirements.
The conflict between the determinists and the instrumentalists will never be resolved. It involves, after all, two radically different views of the nature and destiny of humankind. The debate is as much about faith as it is about reason. But there is one thing that determinists and instrumentalists can agree on: technological advances often mark turning points in history. New tools for hunting and farming brought changes in patterns of population growth, settlement, and labor. New modes of transport led to expansions and realignments of trade and commerce. New weaponry altered the balance of power between states. Other breakthroughs, in fields as various as medicine, metallurgy, and magnetism, changed the way people live in innumerable ways—and continue to do so today. In large measure, civilization has assumed its current form as a result of the technologies people have come to use.
What’s been harder to discern is the influence of technologies, particularly intellectual technologies, on the functioning of people’s brains. We can see the products of thought—works of art, scientific discoveries, symbols preserved on documents—but not the thought itself. There are plenty of fossilized bodies, but there are no fossilized minds. “Gladly would I unfold in calm degrees a natural history of the intellect,” wrote Emerson in 1841, “but what man has yet been able to mark the steps and boundaries of that transparent essence?”[14]
Today, at last, the mists that have obscured the interplay between technology and the mind are beginning to lift. The recent discoveries about neuroplasticity make the essence of the intellect more visible, its steps and boundaries easier to mark. They tell us that the tools man has used to support or extend his nervous system—all those technologies that through history have influenced how we find, store, and interpret information, how we direct our attention and engage our senses, how we remember and how we forget—have shaped the physical structure and workings of the human mind. Their use has strengthened some neural circuits and weakened others, reinforced certain mental traits while leaving others to fade away. Neuroplasticity provides the missing link to our understanding of how informational media and other intellectual technologies have exerted their influence over the development of civilization and helped to guide, at a biological level, the history of human consciousness.
We know that the basic form of the human brain hasn’t changed much in the last forty thousand years.[15]
Evolution at the genetic level proceeds with exquisite slowness, at least when gauged by man’s conception of time. But we also know that the ways human beings think and act have changed almost beyond recognition through those millennia. As H. G. Wells observed of mankind in his 1938 book World Brain, “His social life, his habits, have changed completely, have even undergone reversion and reversal, while his heredity seems to have changed very little if at all, since the late Stone Age.”[16]
Our new knowledge of neuroplasticity untangles this conundrum. Between the intellectual and behavioral guardrails set by our genetic code, the road is wide, and we hold the steering wheel. Through what we do and how we do it—moment by moment, day by day, consciously or unconsciously—we alter the chemical flows in our synapses and change our brains. And when we hand down our habits of thought to our children, through the examples we set, the schooling we provide, and the media we use, we hand down as well the modifications in the structure of our brains.
Although the workings of our gray matter still lie beyond the reach of archaeologists’ tools, we now know not only that it is probable that the use of intellectual technologies shaped and reshaped the circuitry in our heads, but that it had to be so. Any repeated experience influences our synapses; the changes wrought by the recurring use of tools that extend or supplement our nervous systems should be particularly pronounced. And even though we can’t document, at a physical level, the changes in thinking that happened in the distant past, we can use proxies in the present. We see, for example, direct evidence of the ongoing process of mental regeneration and degeneration in the brain changes that occur when a blind person learns to read Braille. Braille, after all, is a technology, an informational medium.
Knowing what we do about London cabbies, we can posit that as people became more dependent on maps, rather than their own memories, in navigating their surroundings, they almost certainly experienced both anatomical and functional changes in the hippocampus and other brain areas involved in spatial modeling and memory. The circuitry devoted to maintaining representations of space likely shrank, while areas employed in deciphering complex and abstract visual information likely expanded or strengthened. We also now know that the changes in the brain spurred by map use could be deployed for other purposes, which helps explain how abstract thinking in general could be promoted by the spread of the cartographer’s craft.
The process of our mental and social adaptation to new intellectual technologies is reflected in, and reinforced by, the changing metaphors we use to portray and explain the workings of nature. Once maps had become common, people began to picture all sorts of natural and social relationships as cartographic, as a set of fixed, bounded arrangements in real or figurative space. We began to “map” our lives, our social spheres, even our ideas. Under the sway of the mechanical clock, people began thinking of their brains and their bodies—of the entire universe, in fact—as operating “like clockwork.” In the clock’s tightly interconnected gears, turning in accord with the laws of physics and forming a long and traceable chain of cause and effect, we found a mechanistic metaphor that seemed to explain the workings of all things, as well as the relations between them. God became the Great Clockmaker. His creation was no longer a mystery to be accepted. It was a puzzle to be worked out. Wrote Descartes in 1646, “Doubtless when the swallows come in spring, they operate like clocks.”[17]
The map and clock changed language indirectly, by suggesting new metaphors to describe natural phenomena. Other intellectual technologies change language more directly, and more deeply, by actually altering the way we speak and listen or read and write. They might enlarge or compress our vocabulary, modify the norms of diction or word order, or encourage either simpler or more complex syntax. Because language is, for human beings, the primary vessel of conscious thought, particularly higher forms of thought, the technologies that restructure language tend to exert the strongest influence over our intellectual lives. As the classical scholar Walter J. Ong put it, “Technologies are not mere exterior aids but also interior transformations of consciousness, and never more than when they affect the word.”[18] The history of language is also a history of the mind.
Language itself is not a technology. It’s native to our species. Our brains and bodies have evolved to speak and to hear words. A child learns to talk without instruction, as a fledgling bird learns to fly. Because reading and writing have become so central to our identity and culture, it’s easy to assume that they, too, are innate talents. But they’re not. Reading and writing are unnatural acts, made possible by the purposeful development of the alphabet and many other technologies. Our minds have to be taught how to translate the symbolic characters we see into the language we understand. Reading and writing require schooling and practice, the deliberate shaping of the brain.
Evidence of this shaping process can be seen in many neurological studies. Experiments have revealed that the brains of the literate differ from the brains of the illiterate in many ways—not only in how they understand language but in how they process visual signals, how they reason, and how they form memories. “Learning how to read,” reports the Mexican psychologist Feggy Ostrosky-Solís, has been shown to “powerfully shape adult neuropsychological systems.”[19]
Brain scans have also revealed that people whose written language uses logographic symbols, like the Chinese, develop a mental circuitry for reading that is considerably different from the circuitry found in people whose written language employs a phonetic alphabet. As Tufts University developmental psychologist Maryanne Wolf explains in her book on the neuroscience of reading, Proust and the Squid, “Although all reading makes use of some portions of the frontal and temporal lobes for planning and for analyzing sounds and meanings in words, logographic systems appear to activate very distinctive parts of [those] areas, particularly regions involved in motoric memory skills.”[20]
Differences in brain activity have even been documented among readers of different alphabetic languages. Readers of English, for instance, have been found to draw more heavily on areas of the brain associated with deciphering visual shapes than do readers of Italian. The difference stems, it’s believed, from the fact that English words often look very different from the way they sound, whereas in Italian, words tend to be spelled exactly as they’re spoken.[21]
The earliest examples of reading and writing date back many thousands of years. As long ago as 8000 BC, people were using small clay tokens engraved with simple symbols to keep track of quantities of livestock and other goods. Interpreting even such rudimentary markings required the development of extensive new neural pathways in people’s brains, connecting the visual cortex with nearby sense-making areas of the brain. Modern studies show that the neural activity along these pathways doubles or triples when we look at meaningful symbols as opposed to meaningless doodles. As Wolf describes, “Our ancestors could read tokens because their brains were able to connect their basic visual regions to adjacent regions dedicated to more sophisticated visual and conceptual processing.”[22] Those connections, which people bequeathed to their children when they taught them to use the tokens, formed the basic wiring for reading.