
Editors: Massimo Piattelli-Palmarini, Juan Uriagereka, Pello Salaburu


BOOK: Of Minds and Language

HIGGINBOTHAM: Yes, to get induction, you need something more. You need the idea that for any number x, if I make enough strokes, I can get to x.

GELMAN: Yes, we didn't ask that one, but there is another one where we asked the question in the Cantorian way. That is, children who were having no trouble with our initial infinity interview were engaged in a version of Cantor's proof. We had drawings of hands in a line, each of which was holding hands with a numeral in a parallel line placed in one-to-one correspondence. We then asked whether we could keep adding hands and numerals, one at a time. This done, we went on to ask whether there were as many hands as numerals. The children agreed. In fact, they agreed at first that equivalence would hold if each person was paired with an odd number. The kids would say yes, probably because they had said yes to the first questions: “You know, they had the same answer.”

But then when we pointed out the contradiction, that we were skipping every even number, the reaction was, “Oh no, this is crazy, lady. Why are you wasting my time?” It probably is the case that even these children did not understand the abstract notions that follow from one-to-one correspondence. However, it is not so easy to develop a task that is free of confounding variables. The trick is to figure out exactly how to ask what you want to get at. And it isn't that easy, because you have to tell them, “I want you to tell me what the induction is,” without telling them that I want you to tell me that. My bottom line? Be careful about saying that there are groups of people who cannot count with understanding, who have only a few number words.
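The set-theoretic point at issue can be made concrete with a small sketch (my illustration, not part of the interview materials): pairing each hand with the next odd numeral is a genuine one-to-one correspondence, which is exactly the property the children were being probed on, even though it skips every even number.

```python
# Hypothetical sketch of the hands-and-numerals task: pair the nth hand
# with the nth odd numeral. The map n -> 2n - 1 is one-to-one, so "as
# many hands as numerals" holds even though every even number is skipped.

def odd_partner(n: int) -> int:
    """Odd numeral paired with the nth hand."""
    return 2 * n - 1

hands = list(range(1, 6))
pairs = [(n, odd_partner(n)) for n in hands]
print(pairs)  # [(1, 1), (2, 3), (3, 5), (4, 7), (5, 9)]

# Distinct hands never share a partner: the correspondence is one-to-one.
assert len({odd_partner(n) for n in hands}) == len(hands)
```

The "contradiction" the children were shown is the counterintuitive part: the pairing is exhaustive on the odds yet misses every even, and both facts hold at once.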

PIATTELLI-PALMARINI: You mentioned quantifiers versus numbers, and not surprisingly, numbers are easier than quantifiers. In fact, there is a dissertation in Maryland, by Andrea Gualmini, showing that children have a problem in understanding quantifiers until very, very late.4 Do you have further data on the understanding of quantifiers?

GELMAN: The question of when quantifiers are understood is very much complicated by the task. I don't know that dissertation, but I know studies from the 1970s showing that the quantifier tasks (all and some, etc.) were not handled well until 6 years of age. We actually have been able to change the alligator task (Hurewitz et al. 2006) so that the kids do very well on all and some questions. The problem is, fundamentally, that we are talking about a set-theoretic concept. Once you make it easier – move them out of the full logic of class inclusion or one-to-one correspondence – the task does get easier, but that is in a sense the point of why I don't understand why anybody thinks the quantifiers are a primitive out of which come the count numbers. The formal rules for quantifiers, whichever formal system you go into, are going to be different, because whatever that system is, it will have a different notation, there will be different rules about identity elements than there are in arithmetic, and the effect of adding, we automatically know, is different. I mean, if you add some to some, you get some. If you add 1 to 1, you don't get 1. So these are very different systems, and furthermore, the quantifiers are very context-sensitive. It depends on what numbers you are working with. So when we looked across the tasks, we could start doing task analysis, but we haven't done it completely.
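Gelman's formal contrast can be caricatured in a few lines of Python. This is a toy model of my own, with an invented name (combine_quantifiers), not anything from the text: "adding" some to some is idempotent and has none as its identity element, whereas arithmetic addition is not idempotent and has 0 as its identity.

```python
# Toy contrast between quantifier "combination" and arithmetic addition.
# Quantities are coarse: a collection is either 'none' or 'some'.

def combine_quantifiers(q1: str, q2: str) -> str:
    """'Add' two coarse quantities: pooling two collections yields
    'some' if either is non-empty; only none + none stays 'none'."""
    return "some" if "some" in (q1, q2) else "none"

# Quantifier combination is idempotent: some + some = some.
assert combine_quantifiers("some", "some") == "some"
# Arithmetic addition is not: 1 + 1 = 2, not 1.
assert 1 + 1 == 2
# The identity elements differ too: 'none' for the quantifier system,
# 0 for arithmetic.
assert combine_quantifiers("some", "none") == "some"
assert 1 + 0 == 1
```

The point of the sketch is only that the two systems obey different algebraic laws, which is why treating count numbers as outgrowths of quantifiers is formally awkward.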

URIAGEREKA: Just a brief follow-up on that. I think in principle it would be useful to bring in the notion of conservativity, which is quite non-trivial for binary quantifiers, as has been shown. So not only would you have numerals versus quantifiers, but among the quantifiers, you would have the ones where in effect you have an order restriction and a scope, versus the ones where you don't, and that probably can make a big difference too.

GELMAN: I totally agree. I should just say I have no argument with that. This is not an accidental combination of people working together. We have, now, two faculty members who specialize in quantifiers and their acquisition, and these are all issues they have written about, are going to work on, and so on. My interest was that this was a way to demonstrate experimentally what I have written about as a purely formal distinction. I had tried to show why arguments about development that involve the count words coming out of the quantifiers didn't make any sense. But that was the logical argument. It was now nice to be able to show that they do behave separately.

URIAGEREKA: This partly also relates to the claim of context sensitivity, because strictly speaking, it is when you do have to organize the part that the quantifier lives on with regard to the scope that you need massive context sensitivity, but not the other way around.

GELMAN: Right.

CHAPTER 16
The Learned Component of Language Learning

Lila Gleitman

Isolated infants and children have the internal wherewithal to design a language if there isn't one around to be learned (e.g., Senghas and Coppola 2001). Such languages exhibit categories and structures that look suspiciously like those of existing languages. There are words like horse and think. Not only that: the mapping between predicate type and complement structure is also quite orthodox, as far as can be ascertained. For instance, even in very primitive instances of such self-made languages, sleep is intransitive, kick is transitive, and give is ditransitive (e.g., Feldman, Goldin-Meadow, and Gleitman 1978). This fits with recent demonstrations – one of which I mentioned during the round-table discussion (see page 207) – that even prelinguistic infants can discriminate between certain two- and three-argument events in the presence of the (same) three interacting entities (Gordon 2003). All of this considerable conceptual and interface apparatus being in place, and (“therefore”) language being so easy to invent, one might wonder why it's hard to acquire an extant language if you are unlucky enough to be exposed to one. For instance, only ten or so of the required 50,000 or so vocabulary items are acquired by normally circumstanced children on any single day; three or four years go by before there's fluent production of modestly complex sentences in all their language-particular glory. What takes so long?

The answer generally proposed to this question begins with the problem of word learning, and is correct as far as it goes: ultimately, lexical acquisition is accomplished by identifying concepts whose exemplars recur with recurrent phonetic signals in the speech or signing of the adult community. That is, we match the sounds to the scenes so as to pair the forms with their meanings. Owing to the loose and variable relations between word use and the passing scene – the “stimulus-free property of language use,” as Chomsky (1959c) famously termed this – knowledge of these form/meaning relations necessarily accrues piecemeal over time and probabilistically over repeated exposures. But in the end (or so the story goes), horse tends to occur in the presence of horses, and race in the presence of racing, and these associations eventually get stamped in. Just so. (I will return presently to mention at least a few of the questions I am begging by so saying.)
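The pairing procedure just described can be caricatured as co-occurrence counting. The sketch below is only an illustration of the general idea (the scene inventories and tallies are invented, not data from any study): across noisy exposures, horse keeps co-occurring with horses, and the recurrent pairing eventually wins out.

```python
from collections import Counter, defaultdict

# Cross-situational caricature of word-to-world pairing: each utterance
# of a word co-occurs with whatever happens to be in the scene, and the
# learner tallies co-occurrences across exposures.
observations = [
    ("horse", {"HORSE", "BARN"}),
    ("horse", {"HORSE", "RIDER"}),
    ("race",  {"RACING", "HORSE"}),
    ("horse", {"HORSE"}),
    ("race",  {"RACING", "CROWD"}),
]

counts = defaultdict(Counter)
for word, scene in observations:
    for referent in scene:
        counts[word][referent] += 1

# After enough exposures, the recurrent pairing "stamps in":
best = {word: tally.most_common(1)[0][0] for word, tally in counts.items()}
print(best)  # {'horse': 'HORSE', 'race': 'RACING'}
```

The stimulus-free scatter of individual uses (BARN, RIDER, CROWD) is exactly why the knowledge must accrue probabilistically over repeated exposures rather than from any single scene.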

Now here is a potentially hard question. Equating for frequency of utterance in caretaker speech, and presupposing the word-to-world pairing procedure just alluded to, some words are easier to acquire than others as indexed by the fact that they show up in the earliest vocabularies of infants all over the world. One general property of these novice vocabularies illustrates this point: The first-learned 100 or so words are – animal noises and ‘bye-bye's excluded – mainly terms that refer in the adult language to whole objects and object kinds, mainly at some middling or “basic” level of conceptual categorization (Caselli et al. 1995; Gentner and Boroditsky 2001; Goldin-Meadow, Seligman and Gelman 1976; Lenneberg 1967; Markman 1994; Snedeker and Li 2000). This is consistent with many demonstrations of responsiveness to objects and object types in the prelinguistic stages of infant life (Kellman and Spelke 1983; Needham and Baillargeon 2000).

In contrast, for relational terms the facts about understanding concepts do not seem to translate as straightforwardly into facts about early vocabulary. Again there are many compelling studies of prelinguistic infants' discrimination of and attention to several kinds of relations including containment versus support (Hespos and Baillargeon 2001), force and causation (Leslie and Keeble 1987), and even accidental versus intentional acts (Woodward 1998). Yet when the time comes to talk, there is a striking paucity of relational and property terms compared to their incidence in caretaker speech. Infants tend to understand and talk about objects first. Therefore, because of the universal linguistic tendency for object concepts to surface as nouns (Pinker 1984; Baker 2001), nouns heavily overpopulate the infant vocabulary as compared to verbs and adjectives, which characteristically express events, states, properties, and relations. The magnitude of this noun advantage from language to language is influenced by many factors, including ratio of noun to verb usage in the caregiver input (itself the result of the degree to which argument dropping is licensed), but even so it is evident in child speech to a greater or lesser degree in all languages that have been studied in this regard (Gentner and Boroditsky 2001). In sum, verbs as a class are “hard words” while nouns are comparatively “easy.” Why is this so?

An important clue is that the facts as just presented are wildly oversimplified. Infants generally acquire the word kiss (the verb) before idea (the noun) and even before kiss (the noun). As for the verbs, their developmental timing of appearance is variable too, with words like think and know typically acquired later than verbs like go and hit. Something akin to “concreteness,” rather than lexical class per se, appears to be the underlying predictor of early lexical acquisition (Gillette, Gleitman, Gleitman, and Lederer 1999). Plausibly enough, this early advantage of concrete terms over more abstract ones has usually been taken to reflect the changing character of the child's conceptual life, whether attained by maturation or learning. Smiley and Huttenlocher (1995: 20) present this view as follows:

Even a very few uses may enable the child to learn words if a particular concept is accessible. Conversely, even highly frequent and salient words may not be learned if the child is not yet capable of forming the concepts they encode … cases in which effects of input frequency and salience are weak suggest that conceptual development exerts strong enabling or limiting effects, respectively, on which words are acquired.

A quite different explanation for the changing character of child vocabularies, the so-called syntactic bootstrapping solution (Landau and Gleitman 1985; Gleitman 1990; Fisher 1996; Gleitman et al. 2005; Trueswell and Gleitman 2007), has to do with information change rather than conceptual change. The nature of the vocabulary at different developmental moments is taken to be the outcome of an incremental multi-cue learning procedure instead of being a reflection of changes in the mentality of the learner:

(1) Several sources of evidence contribute to solving the mapping problem for the lexicon.

(2) These sources vary in their informativeness over the lexicon as a whole.

(3) Only one such source is in place when word learning begins: namely, observation of the word's situational contingencies.

(4) Other systematic sources of evidence have to be built up by the learner through accumulating linguistic experience.

(5) As the learner advances in knowledge of the language, these multiple sources of evidence are used conjointly to converge on the meanings of new words. These procedures mitigate and sometimes reverse the distinction between “easy” and “hard” words.

(6) The outcome is a knowledge representation in which detailed syntactic and semantic information is linked at the level of the lexicon.

According to this hypothesis, then, not all words are acquired in the same way. As learning begins, the infant has the conceptual and pragmatic wherewithal to interpret the reference world that accompanies caretaker speech, including the gist of caretaker–child conversations (to some unknown degree, but see Bloom 2002 for an optimistic picture, which we accept). Words whose reference can be gleaned from extralinguistic context are “easy” in the sense we have in mind; that is the implication of point (3) above. By and large, these items constitute a stock of concrete nominals. Knowledge of such items, and the ability to represent the sequence in which they appear in speech, provides a first basis for constructing the rudiments of the language-specific clause-level syntax of the exposure language; that is, its canonical placement of nominal arguments and inflectional markings. This improved linguistic representation becomes available as an additional source of evidence for acquiring further words – those that cannot efficiently be acquired by observation operating as a stand-alone procedure. The primitive observation-only procedure that comprises the first stage of vocabulary growth is what preserves this model from the vicious circularity implied by the whimsical term “bootstrapping” (you can't pull yourself up by your bootstraps if you're standing in the boots), and is very much in the spirit of Pinker's (1984) “semantic bootstrapping” proposal, with the crucial difference that by and large the initial procedure yields almost solely concrete noun learning. Structure-aided learning (“syntactic bootstrapping”), required for efficient acquisition of the verbs and adjectives, builds upward by piggybacking on these first linguistic representations. An important implication of the general approach is that word learning is subject to the same general principles over a lifetime (for laboratory demonstrations, see Gillette, Gleitman, Gleitman and Lederer 1999; Snedeker and Gleitman 2004). For the same reasons, these principles should and apparently do apply to vocabulary acquisition in later-learned as well as first languages (Snedeker, Geren and Shafto 2007).
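Points (1)–(6) can be sketched as a two-stage toy procedure. This is a hypothetical illustration with invented names (guess_verb_type, the nonce verb "gorps"), not the model's actual implementation: observation alone stocks the lexicon with concrete nouns; those known nouns then let the learner count the arguments around an unknown verb, and the frame constrains its meaning.

```python
# Stage 1 (point 3): observation-only learning yields concrete nouns.
lexicon = {"horse": "HORSE", "dog": "DOG", "ball": "BALL"}

# Stage 2 (points 4-5): with the nouns known, the learner can count the
# arguments surrounding an unknown verb; the argument frame then
# constrains the verb's meaning, as in syntactic bootstrapping.
def guess_verb_type(sentence: list, lexicon: dict) -> str:
    n_args = sum(1 for w in sentence if w in lexicon)
    return {1: "intransitive (like sleep)",
            2: "transitive (like kick)",
            3: "ditransitive (like give)"}[n_args]

print(guess_verb_type(["horse", "gorps"], lexicon))                 # intransitive (like sleep)
print(guess_verb_type(["horse", "gorps", "ball"], lexicon))         # transitive (like kick)
print(guess_verb_type(["horse", "gorps", "dog", "ball"], lexicon))  # ditransitive (like give)
```

The circularity-breaking order matters: the noun lexicon must exist before frames can be parsed, which is why the observation-only stage comes first and why early vocabularies are noun-heavy.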

