Languages in the World
Julie Tetel Andresen and Phillip M. Carter
We have defined language as an orienting behavior that orients the orientee within his or her cognitive domain. When we think about deep time and the coevolution of the human brain and language, we speculate that the first linguistic orientation was likely that of orientation: the way that one person's verbalization, no doubt accompanied by a gesture and a touch, turned another person's attention and bodily movements in a particular direction. "Look this way, not that way"; or: "Follow me"; or: "The good game is over there." We could also call this orientation pointing the way. Linguistic pointing is known by the term deixis, from the Greek adjective deiktikos, meaning 'pointing.'
Deixis indicates the position of people, objects, and events with respect to a particular point of reference. Deictic categories include personal pronouns (I, you, he, she, etc.), demonstrative pronouns (this, that), spatial adverbs (here, there), and temporal adverbs (today, tomorrow, yesterday). How a language divides up the semantic space of the personal pronouns, for instance, can sometimes place this language in a long lineage of languages making similar distinctions. Whether or not a language has a deictic category, say, that of tense, can be similarly indicative of lineage inclusion or exclusion.
When it comes to the languages of the world, aspect is a more frequently occurring distinction than is tense. It also seems to be the case that children master aspect before tense in languages that have both categories (Andrews 2012:222). Possible measures of what counts as basic in language might be categories that most languages have and/or what children around the world are likely to pick up on more readily. The presence or absence of such a basic and widespread category as aspect may or may not be indicative of lineage inclusion or exclusion.
In this chapter, we are interested in getting down to the basics, to the bottom of things linguistic. That is, we will do so in so far as we can for such a complex sociocultural product as language, which has left traces of itself only since recorded history. We will investigate what psycholinguists, evolutionary biologists, population geneticists, and deep-time historical linguists are able to tell us about linguistic states of affairs in the great blank space stretching from the chimp–human lineage split about six million years ago up until 40 kya, when the lineages that eventually split into stocks were taking shape. We will paint what we can of the picture of how Homo sapiens sapiens are also Homo loquens.
We opened Chapter 2 with a discussion of the differing frames of reference used to describe spatial relations on the horizontal plane, and we chose this particular topic to orient you from the beginning to the differing orienting possibilities afforded through different languages. We return to the topic here in order to observe that, while many species have excellent navigating skills (bats, migratory birds, bees), humans cannot be counted among them, and neither can our primate cousins, who tend to be a sedentary lot. Humans have long had to rely on language and culture, and increasingly on accurate navigational technology, to get from Point A to Point Z.
Languages with absolute frames of reference induce in their speakers a kind of knock-on GPS. However, what the exact coordinates are, how fully they are specified, and whether or not they line up with the Western polar system of north, south, east, and west can vary greatly from language to language. We know from Chapter 7 that the Austronesians were excellent navigators.
The Austronesian language Balinese has an absolute frame of reference, but instead of four points, it has only two: one axis is determined by the monsoons and thus operates as a fixed and abstract axis, and the other axis is determined by the location of the central mountain on the island, such that one's relationship to it continually changes as one travels around the island (Levinson 2003:49, Wassmann and Dasen 1998). As always, linguistic structures both support and are supported by cultural practices. In Bali, the absolute system is important in daily life and discourse, and there is evidence that Balinese children as young as four years old retain certain memories in terms of the absolute system (Levinson 2003:312).
The relative frame of reference, on the other hand, is viewer-centered, and the planes of the body serve as the coordinates, with an up and a down, a front and a back, and a left and a right. So the utterance "The man is to the left of the house" entails a viewer with a perspective on the situation that tells us the location of the man with respect to the house but gives us no information about the house itself. Those of us who live our lives through a language with a relative frame of reference find it entirely natural to use the planes of our body to orient ourselves and our fellows in our environments. However, many ethnogroups get along just fine in the world without ever distinguishing between left and right.
How do these frames of reference arise? Where do they come from? Body parts are often the sources for the names of different parts of objects, say, the foot of a mountain or the leg of a table. As we have just said, basic body parts are the front, the back, and the sides. When terms become grammaticalized, they can then serve as generalized references. The utterance "The man is in front of the house" specifies the location of a man with respect to a facet of the house, a facet that is intrinsic to it, namely the front door. As with all things linguistic, such a seemingly and supposedly natural idea as front can have widely different interpretations across languages and cultures. In English, front refers to how one uses an object (front of the TV, front of the book, front of the car, etc.). In Tzeltal, spoken in Mexico, such a term for an object has nothing to do with how someone uses it but is rather based on the object's own facets. And other languages assign fronts to things English speakers do not. In Chamus, a Nilo-Saharan language, trees have fronts according to the direction they lean. From a linguistic point of view, the intrinsic frame of reference is basic. Only once we humans began to cognize the world through language could we imagine ourselves as one of the objects in the universe and create a frame of reference relative to ourselves in terms of left and right. However, the key point is that it is not necessary to create a relative frame of reference if the ethnogroup is doing just fine with the intrinsic frame of reference. As Stephen Levinson in Space in Language and Cognition puts it: "The intrinsic frame of reference is close to linguistic bedrock, in that it is near universal … [and] children appear to acquire it earlier than other systems" (2003:81).
We begin with the human body in order to open up onto the deep-time story of how we humans have come to orient ourselves and others in our respective cognitive domains.
Although good navigational skills are not characteristic of the order Primates, we primates are very good at social cognition, and we have eye, hand, and even foot preferences. Our foot preferences come from the arboreal existence of our ancestors. Living in trees in the primate past (as some primates still do) has at least two consequences of note in the primate present: (i) to this day, primates tend to have one leg shorter than the other; and (ii) our primate ancestors had a preference for standing in trees while holding on to a branch with the right hand, leaving the left hand free for foraging. Thus it is in modern primates that the right hemisphere is specialized for, among other things, spatial cognition and survival-related risks such as emergencies, while the left hemisphere is specialized for, among other things, routine behaviors. Keep in mind that the two hemispheres of the brain are in contralateral relationship to the two sides of the body, such that the left hemisphere mediates the right side of the body, while the right hemisphere mediates the left.
We will compare the human body with the nonhuman primate body by means of a scan from head to foot.
The primate brain (Figure 10.1) is three brains in one, beginning with the so-called reptilian brain. The structures of the reptilian brain include the spinal cord, the cerebellum, the medulla, and the pons. These structures control the body's vital functions such as balance, body temperature, breathing, fight or flight, heart rate, and hunger. On top of the reptilian brain sits the so-called mammalian brain. Its structures include the limbic system and the basal ganglia, and these structures mediate memories of good and bad experiences, emotion, and value judgments. The so-called primate brain is wrapped around the mammalian brain, and its structures include the cerebrum, or white matter, and the neo-cortex, or gray matter. If the thin cortical rim were peeled off and flattened out (cortex means 'bark'), the neo-cortex of a monkey brain would be enough to cover a postcard, that of an ape one sheet of paper, and that of a human four sheets of paper.
In nonhuman primates, the cortical area mediating vocalization is in the limbic association cortex, an area that has an intermediate position between the functions of the limbic system and the cerebral cortex. Because nonhuman primate vocalizations (and their conspecifics' responses to them) are largely involuntary responses to stimuli, often predators, it makes sense that the limbic system handles this function. In humans, by way of contrast, the language-production area has moved into the prefrontal neo-cortex. It is furthermore found in the left hemisphere of most people, with some exceptions for a subset of left-handers who have this area in their right hemisphere and an even smaller percentage of people who have this area distributed in both hemispheres. This location also makes sense because, for humans, language is a routine behavior.
The language-production center in humans is known as Broca's area, and it is named after the nineteenth-century neurologist who discovered the relationship of this brain region to language production. People with damage to this area owing to accident, stroke, or tumor have difficulty producing speech. They can understand what is being said to them, and they know what they want to say, but they simply cannot get it out. Broca's area is adjacent to the motor cortex, a location that goes a long way toward explaining its relationship to speech production. When dealing with language, the brain, and damage, everything depends on location, location, location: where the lesions occur and how extensive they are will determine the type of disability and the possibilities of recovery. For instance, a lesion in one part of the brain can cause the loss of color vision, although the person can still name colors, while a lesion in another part of the brain will cause the person to lose the ability to name colors, although they can still separate colors into categories. What is clear, however, is that everything is connected and that Broca's area is not the be-all and end-all of speech production. Evolutionary biologist Philip Lieberman notes that "Aphasia – permanent loss of language – never occurs in the absence of subcortical damage; it occurs only after damage to the basal ganglia and pathways to them, leaving Broca's area intact" (2013).
Another language area in humans is known as Wernicke's area. People with damage to this area cannot understand what is being said to them. They cannot answer a simple question, and whatever they do say makes no sense at all, although they say it perfectly grammatically and with no production difficulties. Just as Broca's is called a production disorder, so Wernicke's is called a reception disorder. The speech these people produce is fluent nonsense. Wernicke's area is near the primary auditory cortex, and it is possible that the person so afflicted can neither hear nor understand what is being said, nor can this individual monitor his or her own speech. There is another area near the auditory cortex, not far from Wernicke's, called Heschl's, and lesions to this area produce word deafness. The person can still hear environmental noises but has lost the ability to hear language. They may well retain the ability to speak, read, and write.
Two points need to be made. First, we do not want to say that language is in these areas any more than we want to say that walking is in the legs; rather, Broca's, Wernicke's, and Heschl's are areas where certain modalities of interactions converge in order to synthesize a behavior. If there is a disruption in the area, the behavior cannot be synthesized, and thus the behavior cannot be produced well or at all. Although there are important localizations in the brain for certain language behaviors, we think of language as a whole-brain activity, such that damage to almost any area of the brain can have undesirable linguistic effects. For instance, there is a structure in the basal ganglia called the putamen, which aids mammals in the movement of limbs. In a multilingual human, damage to the putamen may impede this person's ability to stay on track in one language or another. Furthermore, damage to the cerebellum, which is in the reptilian brain, can cause speaking, listening, reading, and writing deficits and dysfluencies.
Second, areas such as Broca's, Wernicke's, and Heschl's are not necessarily devoted exclusively to language production and reception, and damage to these areas can also cause other motor disorders, such as palsies, and other kinds of cognitive disabilities. Because there are so many noninvasive brain-imaging technologies available today, it is no longer necessary to determine the particulars of brain function through the study of how and when it malfunctions. It has been shown, for instance, that Broca's area is involved in such nonlinguistic tasks as searching for a target hidden within a complex geometric pattern, solving math problems, holding information in working memory, and perhaps supporting musical abilities.
When a physical object is set into vibration, a sound wave is produced. Plucking a guitar string, tapping a piano key, striking a tuning fork, and vibrating the vocal folds in the human larynx all produce sound waves, which propagate through a physical medium. For human language, the typical medium is air. The waves move out from the sound source, namely the vocal tract, as air molecules compress together and spread apart. The human ear is adapted to the frequencies of human speech, and so articulation and acoustics are two sides of the same coin.
We perceive speech when a sound wave reaches our eardrum and is processed by the part of the cerebral cortex known as the auditory cortex. Part of what we perceive is pitch. We perceive a sound to be high pitched or low pitched depending on the rate at which the physical object (our vocal folds, for example) vibrates. This rate of vibration is known as the fundamental frequency and is measured in Hertz (Hz), or cycles per second. Different voiced sounds (vowels and voiced consonants) are produced when the sound wave is filtered by the human vocal tract. When we say that [a] is a different sound from [i], it is not only because the former is made by retracting the tongue body while opening the mouth, and the latter is made by advancing and raising the tongue body while keeping the lips spread, but also because these articulatory movements filter the acoustic sound wave being emitted by the vocal folds. The shape the vocal tract assumes when articulating [a] creates a unique filter that amplifies certain components of the sound wave while dampening others, resulting in the particular sound waves that we perceive as speech.
Linguistic tones, in a language like Tibetan or Vietnamese, are identifiable levels or movements of pitch that occur when speakers vary the frequency of a vowel. Humans are capable of perceiving sounds within the range of 20 Hz to 20 kHz, although most human voices use a pitch range between 80 Hz and 250 Hz.
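To make the relation between vibration rate and perceived pitch concrete, here is a minimal sketch in Python (ours, not the authors'). It converts a fundamental frequency into the duration of a single vocal-fold cycle and checks it against the audibility and speaking-pitch ranges just cited; the constant and function names are illustrative only.

```python
# Illustrative sketch: fundamental frequency (f0, in Hz) versus the period of
# one vocal-fold cycle, using the ranges cited in the text. Names are ours.

AUDIBLE_RANGE_HZ = (20, 20_000)      # range of human hearing cited above
TYPICAL_VOICE_RANGE_HZ = (80, 250)   # typical speaking pitch range cited above

def period_ms(f0_hz: float) -> float:
    """Return the duration of one vocal-fold cycle in milliseconds."""
    return 1000.0 / f0_hz

def classify_pitch(f0_hz: float) -> str:
    """Report whether a frequency is audible and whether it falls in the
    typical speaking range."""
    lo, hi = AUDIBLE_RANGE_HZ
    if not (lo <= f0_hz <= hi):
        return "outside the range of human hearing"
    v_lo, v_hi = TYPICAL_VOICE_RANGE_HZ
    if v_lo <= f0_hz <= v_hi:
        return "within the typical speaking pitch range"
    return "audible, but outside the typical speaking pitch range"

if __name__ == "__main__":
    for f0 in (100, 200, 400):
        print(f"{f0} Hz -> one cycle every {period_ms(f0):.1f} ms; {classify_pitch(f0)}")
```

Running the sketch shows, for example, that a voice at 200 Hz completes one vocal-fold cycle every 5 milliseconds, while 400 Hz is audible but lies above the typical speaking range.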
The lower part of the nonhuman primate face has none of the mobility and coordination of the human mouth, jaw, and tongue. In the area around the mouth, humans have around 100 muscles. When we are speaking, these muscles respond to one of three commands every couple of milliseconds: relax, contract, or maintain tension. This is a complex physical feat. The complexity of speech production is compounded by our ability to anticipate upcoming sounds and the lip gestures required to articulate them. Take the words constrict and construe. In the first word, when we pronounce the vowel of the second syllable, our lips are not rounded. In the latter word, when we pronounce the vowel of the second syllable, our lips are rounded. Now, say these two words out loud, but stop yourself a split second before completing the first syllable. You will notice that just as you are forming the consonant cluster str-, your mouth has already anticipated the lip shape for the upcoming vowel.
No nonhuman primate can do this. In the 1970s, when researchers began to study the possibility of teaching human language to chimps, they had to construct boards with lexigrams the chimps could manipulate with their hands in order to make requests or to answer questions.
In comparison with all other animals, the human larynx, which houses the vocal folds, commonly called 'vocal cords,' is low in the throat. This lowered position increases the possibility of choking but enhances humans' ability to produce speech by creating more space for the vocal tract. Human babies are born with the larynx relatively high in the throat. It lowers slowly, and at around three months, the larynx is low enough for the baby to begin to babble. By the end of the first year of life, the larynx has settled into position. Babbling is a very important activity because it creates connections between the larynx and the prefrontal cortex. All babies babble, even deaf babies, because babbling is an activity of a normally maturing body. If a baby does not babble, it is a sign that the child will have severe cognitive deficits. Studies have shown that the sounds of babbling babies vary with the ambient language, so that babies in French-speaking environments babble slightly differently than babies in English-speaking environments.
In humans, the larynx is the point of contact between the viscera and the prefrontal cortex. The relationship between viscera and the skeleton in speech is the inverse of the relationship between the viscera and the skeleton in nonhuman primate vocalizations. In the latter, the relationship is more like that when humans are laughing or crying: the skeleton is stable, while the viscera are in motion, and the breath is being panted in or out. In the former, namely speech, the skeleton is mobile, the viscera are stable, and the breath is flowing across the places of articulation, for example, larynx, velum, palate, teeth, and lips. In this context, it is pertinent to note that babies only babble when they are calm. When they are upset, we know what they do: they cry. Speech, by way of contrast to crying, is a routine behavior of a relaxed nervous system, which is why it is so difficult to speak while crying.
Human respiration is different from nonhuman primate respiration in that humans have cortical control of the diaphragm. We manage the air pressure in our lungs according to the length of what we have to say, and we time our breathing accordingly. Evolutionary biologist Terrence Deacon speculates that the human adaptation of respiration to speech was two million years in the making.
We primates are a touchy-feely bunch. It is difficult for us to keep our hands to ourselves and off other things and other people. Although we humans no longer pick nits off our best friends' skins, we understand the grooming behaviors of our cousins. Evolutionary psychologists Louise Barrett and Robin Dunbar believe language started as a kind of grooming behavior at a distance. Indeed, when two people are no longer on speaking terms, it is akin to being off one another's grooming list or, in more modern terms, defriended on social media. There is some truth to the idea that the kinds of partnerships nonhuman primates create through actual touch, humans now create through language. With spoken language, human hands are free to do other things, namely gesture communicatively, which nonhuman primates do not do.
When speaking of communicative hand gestures, we do not mean the kind you make when you twirl your forefinger around your temple to signal 'crazy' or when you put your thumb up to signal 'good, okay.' These gestures are culturally conventional and conscious, and they are called emblems. By way of contrast, the gestures of the speech–gesture circuit are idiosyncratic, manufactured on the fly, imagistic, and largely unconscious. This circuit is, furthermore, an integrated whole. People who suffer from Broca's impairment not only have disrupted speech but also have halting, stutter-like gestures while attempting to speak. The gestures of people who suffer from Wernicke's impairment are like their speech in that they are fluid but 'all over the place.'
The strange case of IW confirms the integrity of the speech–gesture circuit. At age 19, IW lost all sense of touch and proprioception below the neck. This meant he lost all motor control that depends on bodily feedback. For instance, when we walk, we receive proprioceptive feedback from the soles of our feet, and this feedback keeps us on track and moving forward. IW slowly relearned to walk and eat, and do everything else, not by regaining his lost proprioception and spatial position sense but by using cognition and vision. Of interest here is the fact that his ability to gesture when speaking was not lost, while instrumental gesture was. That is, he cannot pick up a brick if he cannot see his hands. However, he can and does gesture while speaking, even if he cannot see his hands.