The First Word: The Search for the Origins of Language

Is it possible that even though some of these accounts appear mutually exclusive, the researchers are actually describing different stages of a cohesive evolutionary narrative? Yes. It’s likely that different parts of many theories will survive in a grand synthesis, because within this vast time frame numerous evolutionary pressures had some effect. Given how the recent accumulation of data about brain function and genetic influences on language has forced researchers to constrain their theories, more widespread agreement isn’t out of the question in the near future.

At any rate, the question of which specific evolutionary pressures were in play at which moment in time is a less prominent consideration in the field. There is less concern about why language came to be because there is so much to say about what came to be and how it came to be—which gene changed, which behavior is ancient, and which ability is new? At this point, we must be content to survey all possible answers and acknowledge that in the last six million years many of them probably played a role. The stories can be especially helpful as spurs to testable hypotheses.

 

 

 

In addition to examining the specific pressures, incidents, and abilities that have contributed to the story of language evolution, it is also important to look at co-evolution: the way that human language and the human genome have shaped each other. Co-evolution is the least explored aspect of the mystery. For all the difficulty and challenge of tracing language evolution, working out how species and language arise over time and then provide feedback to each other is probably the hardest part.

Terrence Deacon has grappled with the issue of co-evolution, focusing on the back-and-forth between language and the brain. Recall that he proposed that the beginnings of language and symbol use can be found in the shift from australopithecines to hominids some two million years ago. What preceded this evolutionary shift was the use of flaked stone tools. Deacon argues that it was this tool use that spurred the evolution from one kind of primate to another and, in doing so, created an animal with a predisposition for even more symbol use.

This kind of change, which is called Baldwinian evolution, occurs when the behavior of an animal actually contributes to the environment in which its genetic evolution is shaped. Lactose tolerance is an example of Baldwinian evolution in humans. The ability to digest dairy products in adulthood is most common in groups of people who have been herding animals the longest. In this case it was a behavior—herding—and not a climatic change or some other environmental shift that contributed to the selection pressures under which a predisposition for lactose tolerance improved reproductive success.

The australopithecine tool use helped to create a world where it was more and more useful to have the genetic predisposition underlying that behavior. The better an individual was at it naturally, the more likely he or she was to survive and have offspring, probably passing this trait on to them, and the more significant that behavior became in the world of the species. It wasn’t that our brains got bigger as a result of bipedalism or dietary changes or any other reason, thereby making us clever enough to invent stone tools; rather, we started to use stone tools that are slightly more complicated than the tools chimpanzees use even today, and as a result our brains got bigger.

The co-evolutionary story that began at this time and that continues to this day is one in which the Baldwinian interaction between culture and biology played a particularly significant role. Deacon points out that our brains did not get bigger in the australopithecine-hominid transition in the same way that the surface of an inflating balloon gets bigger all over. It was principally the cerebral cortex of the forebrain, along with the cerebellum, that ballooned, while the rest of the brain followed the growth rate seen in other primate brains.² In order to unwind the ways that language and the brain have co-evolved, you have to look at the parts of the brain that got bigger, says Deacon, and you have to look at how they got bigger.

The prefrontal cortex corresponds to a small section of the developing brain in the human fetus. When the brain is nothing more than a neural tube, the part of the tube that later turns into the forebrain breaks out of the growth patterns that constrain the rest of the brain. This stretch of tube is controlled by the Otx and Emx genes. The developmental clock that signals to every part of the brain and body when to stop growing has been extended for the regions controlled by these genes. The significance of this altered growth pattern, according to Deacon, is not that human brains are faster and better computers; it is that the balance has shifted in terms of what kind of thinking goes on in the brain. As a result, our learning skills are biased toward certain types of processing and not others.

Acquiring and deploying the particular kinds of connections and structural patterns that characterize language, says Deacon, pose some very unusual learning problems, and the kinds of learning processes that most mammal brains are specialized for are not well equipped to deal with these problems. However, the neural machine that results from the human combination of body and brain growth patterns is one that rather brilliantly performs the computations that underlie language learning.

The fact that language arises from dynamically interacting brain regions with their vastly different evolutionary histories (the more primitive and unchanged along with the parts that have shifted more recently) is another reason why we should not think of language, or even other mental abilities, such as mathematics, as monolithic things. Instead, argues Deacon, they arise out of a “delicate balance of many complementary and competing learning, perceiving and behavioral biases.”³

Upending these assumptions about brain evolution leads us to a startling conclusion, says Deacon. One of the reasons we haven’t been able to work out how language and the brain co-evolved is that we have been asking the wrong question all along. From the beginning, researchers investigating the brain and language have assumed that the brain came first. The usual line of reasoning holds that the brain was selected for increased general intelligence and then it evolved language, which relies on that optimized intelligence. Actually, says Deacon, we should be looking at the effect of language on the brain, as well as the effect of the brain on language.

Generally, the amount of brain tissue devoted to a particular type of processing is proportional to the amount of information being processed. The brain regions that serve seeing, smelling, and touching, for example, are matched in size to the amount of information that reaches them through those senses. One of the crucial differences between the human brain and other mammal brains is that ours is larger overall relative to the body. This leaves a considerable proportion of the human brain, says Deacon, that is not processing information from the outside world in the way that the visual and auditory cortices are.

Even though they are not directly processing sights, sounds, and other sensations, the unusually expanded prefrontal brain regions look as if they have been “deluged with some massive new set of…inputs.”⁴ The larger brain region, says Deacon, is “an evolutionary response to a sort of virtual input with increased processing demands.”⁵ That input, of course, is language.

In this view, language cannot ultimately be treated as a straightforward example of the capabilities of the brain, and we should not be asking “How did the brain evolve language?” Rather, we should ask, “How did language evolve the brain?” Language is the author of itself, says Deacon, and the brain is the smoking gun for language.

The result of the co-evolution of the human brain and language is that we now have an overall cognitive bias toward the “strange associative relationships of language.” In this sense our whole brain is shaped by language, and many of our cognitive processes are linguistic. What this means, according to Deacon, is that once we have adapted to language, we can’t not be language-creatures. For us, everything is symbolic.

Indeed, Deacon explains, the virtual world that we inhabit is as real as, and sometimes more real than, the physical world. Even the tendency to infer the hand of a designer when faced with complex design (whether it is a deity that has designed all of creation or a special language organ that generates human languages) arises from the fact that we are a symbolic species. Ironically, what makes it hard to discern how language evolved is a result of language having evolved. The worldwide web of words and rules that we inhabit is so vast, contracted, and dense, it’s hard to look in from the outside.

 

 

 

Arbib and Deacon seek to illuminate moments in the last six million (and more) years of human evolution. By comparison, the last ten thousand years is a blip. Nevertheless, it is an interesting period in the history of the human brain and also of language. There is some suggestion that our brains may have changed as recently as ten thousand years ago, and in fact become smaller. It will be some time before data on the trajectory of brain growth in this period are solid enough for us to be confident of this change. Generally, it is thought that within the last ten thousand years there has been no obvious anatomical change arising from the drift and selection of genes in our species. The same goes for language. Most language change in this time frame is associated not with obvious biological change in humans but with the movement of human populations and the transformation of their lifestyles.

Jared Diamond and Peter Bellwood examined the effect of farming, which independently arose in human communities at least nine different times between 8500 and 2500 B.C.⁶ The researchers demonstrate that the advantages of farming over hunter-gatherer lifestyles, including greater access to food, denser populations, and greater resistance to disease, spurred the spread of farming communities, which carried their culture and language with them. They propose essentially that prehistoric language and genes spread with prehistoric farming, and that tracking one will illuminate the ancient paths taken by the other.

There are many different types of clues to the prehistory of language, and their intersecting relationships are complicated.⁷ Here the researcher interested in connecting the long and short arcs traced by language in time must master at least genetic, archaeological, paleoanthropological, linguistic, and geographic evidence. As more researchers engage with this multidimensional problem, we will see ever more clearly how a mental bias gave rise to a language, which became languages, and then rich and sprawling language families.

IV. WHERE NEXT?

15. The future of the debate

Pinker and Bloom’s 1990 paper caused a sea change in the attitude toward language evolution, and the early years of research that followed were a time of great exhilaration and puzzlement. The 1996 Evolution of Language conference, organized by Jim Hurford and Chris Knight, was the first in what became a series of biennial meetings for scholars from various disciplines and countries to come together to address this issue. The participants brought their biases and jargon with them, and there was less shared language and understanding than had been hoped. In the end, no synthesis was reached that would get everyone on the same page. In this early period, a great deal of energy was expended in simply justifying the research. As the years went by and more data and ideas accrued in the biennial conferences, and as other conferences also started up, certain questions and methods—those reviewed in parts 2 and 3—emerged as central.

Neither Pinker nor Chomsky said much on the topic in this period. In 2002, however, Chomsky appeared in a panel discussion at the Harvard Evolution of Language conference. Tecumseh Fitch was one of the conference organizers, and Marc Hauser sat on the stage next to Chomsky. Pinker was in the audience, although he, like Chomsky, had not attended other conference presentations. Chomsky suggested that language evolved separately from speech, because deaf children are still able to learn sign language, and he proposed that people use language more for talking to themselves than for talking with other people.

Later that year, Hauser, Chomsky, and Fitch published a paper in Science called “The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?” The point of the paper was to provide a framework for fruitful discussion and clear up confusion in the field. It argued that a lot of research vital to an understanding of language and linguistic evolution was typically ignored or dismissed by linguists, and it also advocated collaboration between researchers from different disciplines.

In an accompanying editorial, titled “Noam’s Ark,” linguists Thomas Bever and Mario Montalbetti wrote: “Language is naturally viewed as a unique feature of being human. Accordingly, the study of what language is—linguistics—has been very influential, primarily in the social and behavioral sciences…Hauser, Chomsky, and Fitch expand the scope of language study with their demonstration that complex behaviors in animals and non-linguistic behaviors in humans can inform our understanding of language evolution.”

The article inspired many impassioned responses, some as enthusiastic as Bever and Montalbetti’s. Others expressed shock and even rage. “The Faculty of Language: What Is It, Who Has It, and How Did It Evolve?” gave the impression, at least to some, that Chomsky had abandoned his old view of language and swapped sides in the great debate. Derek Bickerton, a longtime Chomskyan linguist, wrote:

 

Into the middle of this confused and confusing situation there appeared in the journal Science a paper…aimed at setting the scientific community straight with regard to language evolution. Its magisterial tone was surprising, considering how little work any of its authors had previously produced in the field, but no more surprising than the collaborators themselves: since Hauser was known as a strong continuist and Chomsky as a strong discontinuist, it was almost as if Ariel Sharon and Yasser Arafat had coauthored a position paper on the Middle East. In this paper, practically every aspect of the language faculty is treated as pre-existing the emergence of language, except for “narrow syntax” (whether this is the same as, or different from, the old “core syntax,” we are nowhere told), which consists solely of recursion. Even recursion is supposed to derive from some prior computational mechanism employed by antecedent species for navigation, social cognition or some other purpose as yet undetermined, and then exapted for syntax; researchers are adjured to start searching for such mechanisms.¹

 

The reaction to “The Faculty of Language” served as a catalyst in the same way the Pinker and Bloom paper did twelve years earlier. The perception of an allegiance to Chomsky was a lightning rod, although it meant different things to different people. There were two main camps of disagreement. Some critics thought the paper consisted of the same Chomskyan ideas of the last four decades, dressed up as something novel with animal data attached. Taking the completely opposite view, others were angered by what they saw as a retraction of ideas that Chomsky had spent years developing. Depending on their field, researchers suspected either that Chomsky had influenced Hauser and Fitch or that Hauser and Fitch had hijacked Chomsky.

 

 

 

In their Science paper, Hauser, Chomsky, and Fitch proposed a two-part model of language, based on a broad faculty of language and a narrow faculty. The broad faculty comprises the narrow faculty, in combination with two other systems. The first consists of the nerves, muscles, and organs that enable us to see, hear, and touch the world around us; it also includes the physical characteristics we use to create and interpret speech, such as the agility of our tongue, the position of our larynx, and our ability to interpret stress and pitch. The second system consists of a creature’s knowledge of the world and its capacity to use that knowledge to form intentions and act upon them. The authors called them the sensory-motor and the conceptual-intentional systems.

At a minimum, wrote the authors, the narrow faculty is a computational system that “includes the capacity of recursion.” Elsewhere, they described the key component of the narrow faculty as a recursive computational system that generates linguistic structure and maps it onto the two other systems. In this sense, the narrow faculty of language is an interface between recursive computational abilities, the body, and thought.

The authors then presented a distillation of opinion in the field in the form of three distinct hypotheses, using their terminology of a broad and a narrow faculty. In one hypothesis, all components of the broad faculty of language have homologs in other animals, so there is nothing in language that is unique to humans. In an alternate hypothesis, the broad faculty is a uniquely human adaptation: even if other animals have traits that appear similar to the human traits used for language, such as social intelligence or toolmaking, those traits have been significantly refined in the human lineage and should be considered novel features, specific to humans.

In a third hypothesis of their own, the authors proposed that most of the broad faculty of language is shared with other species, and that any differences between the human and animal traits are quantitative rather than qualitative. They cited experiments conducted by themselves and others showing that animals understand the world in complicated ways. For instance, some birds use the sky and landmarks to help them navigate complex paths; other animals, such as monkeys, can recognize and, to varying degrees, use abstract ideas like color, number, and geometric relationships; many different species can use mirrors to locate objects, and chimps, bonobos, and orangutans even appear to recognize their own reflections; and chimpanzees seem to infer from a person’s or a fellow chimp’s actions what that creature is thinking.

In contrast, the narrow faculty of language is a recent, uniquely human innovation. Hauser, Chomsky, and Fitch noted that even though the recursive mechanisms that underlie syntax may be unique to humans, they are not necessarily unique to language. Instead, this system could be a spandrel, having evolved for something other than communication and still used in nonlinguistic domains. Where did this capacity come from? Perhaps, they wrote, it was initially used for navigating social relationships and only later co-opted by language. They pointed out that because chimpanzees have highly complicated social systems, they must remember (without the help of language) who among them is dominant and who is not. Pre-linguistic humans may have faced similar challenges and solved them with mental recursion.

Certain ideas in the Science paper were familiar to anyone who followed Chomsky’s work. He was, of course, the first linguist to attach importance to the fact that human brains can take a finite set of entities, such as words, and generate an unlimited number of patterns with them, such as sentences. As we now recognize, this makes human language limitless, and most important, this recursive mechanism allows us to express complicated thoughts. We’re not restricted to only one level of observation or knowledge; we can see (and say), “He knows,” but also, “She knows that he knows.” Each level of recursion is a step upward in complexity.

Moreover, Chomsky had previously suggested that the mechanism of recursion extended beyond language and was vital to human cognition more broadly. As the Science article pointed out, recursion is characteristic of the number system as well as the grammatical system. Just as “Mary thinks that” could be added to any sentence, “2x” could be added to any equation, no matter how long it already is.
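To make the recursion point concrete, here is a minimal sketch in Python (my illustration, not from the book or the Science paper): a single, finite rule keeps wrapping a clause in another “X knows that…” layer, so sentences of any depth fall out of the same tiny mechanism. The function name and example sentences are invented for illustration only.

    # Illustrative only: a toy recursive rule in the spirit of
    # "She knows that he knows." One finite rule, unbounded depth.
    def embed(clause, knowers):
        """Wrap `clause` in one 'X knows that ...' layer per knower."""
        if not knowers:                 # base case: nothing left to embed
            return clause
        first, *rest = knowers
        return f"{first} knows that {embed(clause, rest)}"  # recursive step

    print(embed("he knows", ["She"]))
    # She knows that he knows
    print(embed("it is raining", ["Mary", "she", "he"]))
    # Mary knows that she knows that he knows that it is raining

The arithmetic analogy has the same shape: however long an expression already is, the same rule can always be applied to it once more.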

In essence, the Hauser, Chomsky, and Fitch hypothesis said that although other animals may indeed have a rich understanding of the world, they have no way to convey it. It was only when humans connected their internal understandings with the means to express them that they gained their unique form of language. After the article was published, Hauser remarked, “When those things got married, the world was changed.”

 

 

 

Steven Pinker and Ray Jackendoff published a response to Hauser, Chomsky, and Fitch, and a vehement back-and-forth ensued. (In all, four papers, including the original Science article, were published.) Pinker and Jackendoff charged Chomsky with abandoning the last twenty-five years of his research and co-opting ideas from models he had once completely dismissed.

“I think the thing that startled a lot of people about that Science paper,” said Jackendoff, “was that all of a sudden Chomsky seemed to be saying that language isn’t so complex after all—that all this complexity is coming from the interaction of this very simple system with the interfaces, and so to many linguists it was like Chomsky was undermining the position on which we had all grown up and many of us still believe. Pinker’s and my reply wasn’t so much about the evolution of language as the character of language. We wanted to say, ‘Look, there are all these complexities to language, and they don’t reduce out to general capacities found in other animals.’”

Pinker and Jackendoff argued that Chomsky and his co-writers implied that Chomsky’s linguistics was the only kind of linguistics there was, which in effect predetermined their definition of language. Throughout the paper, as throughout most of Chomsky’s writing, language is described as having a “core”—a small set of very important features that lie at the heart of the phenomenon. But, Pinker and Jackendoff argued, there is no core to language. The appearance of one is just a mirage, an artifact of the way Chomskyans carve up language in the first place. Language is a complicated mass that can’t be neatly reduced to a smaller concentrated essence or set of rules.

Pinker and Jackendoff also emphasized the idea that existing organs and functions had been gradually modified for language, in contrast to the Hauser, Chomsky, and Fitch approach of assigning traits to one bin or the other—broad and shared with many animals, or narrow and uniquely human.

In an interview, Pinker later said:

 

I don’t think a theory of language evolution based on a theory of language that is idiosyncratic to one person’s vision is productive. I don’t think divorcing language from communication is a step forward, and I don’t think writing off everything but syntax, indeed everything but recursion, and giving it to the animals, is a step forward. I think Chomsky so badly wanted to save something as unique to humans, namely the core of syntax, that he was willing to sacrifice everything else, in particular, the parts of language he is less interested in, like speech and words. It reminds me of the lizard that lets its tail break off when a predator is about to attack.

 

Philip Lieberman took the opposite view of the paper. “It’s the same old Chomsky claim—a unique neural system or device specific to language exists in humans and humans alone, allowing infinite ‘recursion.’ It is a sea of words covering up Chomsky’s unchanged view concerning the essence of language—it is a capacity shared by no other animal and distinct from any other aspect of human behavior.”

For scholars like Lieberman, the authors’ proposal to use comparative data to explore the question of language evolution was disingenuous. As he explained, “The comparative method has been used for many years to explore the evolution of language—my first published paper comparing monkeys to humans was published in 1968.” Thus, rather than illuminate a way forward, the paper—for some of its critics—obscured the intellectual history of many of the studies it mentioned. Lieberman said, “The aspects of language that Hauser, Chomsky, and Fitch believe can be revealed through comparative behavioral and neurophysiologic studies are the ones that Chomsky and his disciples have always considered trivial and irrelevant.”²

Similarly, William D. Hopkins, whose work revealed a homolog of Brodmann’s area 44 in chimpanzees, observed that even though Chomsky was finally incorporating animal data, he was using it to designate commonalities between humans and other animals as somehow not important to language. “I’m not sure what that is,” he said, “but it’s not the comparative method.”

As for recursion, Lieberman argued that it was adequately accounted for in the brain’s control of the motor system. Pinker and Jackendoff pointed out that recursion occurs not only in language but also in vision, thus providing little motivation to restrict it to a narrow faculty of language. Irene Pepperberg noted that as far as comprehension was concerned, recursion isn’t necessarily unique to humans. Still others raised the possibility that even humans don’t use recursion very much, or very effectively.
