Of Minds and Language

Edited by Massimo Piattelli-Palmarini, Juan Uriagereka, and Pello Salaburu

At the time of the 1974 biolinguistics conference, it seemed that the language faculty must be rich, highly structured, and substantially unique to this cognitive system. In particular, that conclusion followed from considerations of language acquisition. The only plausible idea seemed to be that language acquisition is rather like theory construction. Somehow, the child reflexively categorizes certain sensory data as linguistic experience, and then uses the experience as evidence to construct an internal language – a kind of theory of expressions that enter into the myriad varieties of language use.

To give a few of the early illustrations for concreteness, the internal language that we more or less share determines that sentence (3a) is three-ways ambiguous, though it may take a little reflection to reveal the fact; but the ambiguities are resolved if we ask (3b), understood approximately as (3c):

(3)    a. Mary saw the man leaving the store

         b. Which store did Mary see the man leaving?

         c. Which store did Mary see the man leave?

The phrase which store is raised from the position in which its semantic role is determined as object of leave, and is then given an additional interpretation as an operator taking scope over a variable in its original position, so the sentence means, roughly:

for which x, x a store, Mary saw the man leav(ing) the store x

– and without going into it here, there is good reason to suppose that the semantic interface really does interpret the variable x as the store x, a well-studied phenomenon called “reconstruction.” The phrase that serves as the restricted variable is silent in the phonetic output, but must be there for interpretation. Only one of the underlying structures permits the operation, so the ambiguity is resolved in the interrogative, in the manner indicated. The constraints involved – so-called “island conditions” – have been studied intensively for about forty-five years. Recent work indicates that they may reduce in large measure to minimal search conditions of optimal computation, perhaps not coded in UG but more general laws of nature – which, if true, would carry us beyond explanatory adequacy.
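
Spelled out schematically, the operator-variable structure just described for (3b) can be displayed as follows (an informal rendering of mine, with the bracketing and the variable made explicit, not notation used in the text):

(3′)    which store [ did Mary see the man [ leaving x ] ]

The raised phrase which store serves as the operator, and x is the variable it binds in the original object position: silent in the phonetic output, but interpreted at the semantic interface, by reconstruction, as the store x, as in the gloss above.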

Note that even such elementary examples as these illustrate the marginal interest of the notions “well-formed” or “grammatical” or “good approximation to a corpus,” however they are characterized.

To take a second example, illustrating the same principles less transparently, consider sentences (4a) and (4b):

(4)    a. John ate an apple

         b. John ate

We can omit an apple, yielding (4b), which we understand to mean John ate something unspecified. Now consider

(5)    a. John is too angry to eat an apple

         b. John is too angry to eat

We can omit an apple, yielding (5b), which, by analogy to (4b), should mean that John is so angry that he wouldn't eat anything. That's a natural interpretation, but there is also a different one in this case: namely, John is so angry that someone or other won't eat him, John – the natural interpretation for the structurally analogous expression

(6) John is too angry to invite

In this case, the explanation lies in the fact that the phrase too angry to eat does include the object of eat, but it is invisible. The invisible object is raised just as which store is raised in the previous example (3), again yielding an operator-variable structure. In this case, however, the operator has no content, so the construction is an open sentence with a free variable, hence a predicate. The semantic interpretation follows from general principles. The minimal search conditions that restrict raising of which store in example (3) also bar the raising of the empty object of eat, yielding standard island properties.
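
The same informal notation (again mine, not the text's) brings out the parallel. In (5b) and (6) the raised object is an operator without content, written here as Op:

(5′)    John is too angry [ Op [ to eat x ] ]

(6′)    John is too angry [ Op [ to invite x ] ]

Since Op is empty, the bracketed infinitival is an open sentence with the free variable x, hence a predicate, understood of John: on this reading, he is too angry for someone or other to eat him, or to invite him.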

In both cases, the same general computational principles, operating efficiently, provide a specific range of interpretations as an operator-variable construction, with the variable unpronounced in both cases and the operator unpronounced in one. The surface forms in themselves tell us little about the interpretations.

Even the most elementary considerations yield the same conclusions. The simplest lexical items raise hard if not insuperable problems for analytic procedures of segmentation, classification, statistical analysis, and the like. A lexical item is identified by phonological elements that determine its sound along with morphological elements that determine its meaning. But neither the phonological nor morphological elements have the “beads-on-a-string” property required for computational analysis of a corpus. Furthermore, even the simplest words in many languages have phonological and morphological elements that are silent. The elements that constitute lexical items find their place in the generative procedures that yield the expressions, but cannot be detected in the physical signal. For that reason, it seemed then – and still seems – that the language acquired must have the basic properties of an internalized explanatory theory. These are design properties that any account of evolution of language must deal with.

Quite generally, construction of theories must be guided by what Charles Sanders Peirce a century ago called an “abductive principle,” which he took to be a genetically determined instinct, like the pecking of a chicken. The abductive principle “puts a limit upon admissible hypotheses” so that the mind is capable of “imagining correct theories of some kind” and discarding infinitely many others consistent with the evidence. Peirce was concerned with what I was calling “the science-forming faculty,” but similar problems arise for language acquisition, though it is dramatically unlike scientific discovery. It is rapid, virtually reflexive, convergent among individuals, relying not on controlled experiment or instruction but only on the “blooming, buzzing confusion” that each infant confronts. The format that limits admissible hypotheses about structure, generation, sound, and meaning must therefore be highly restrictive. The conclusions about the specificity and richness of the language faculty follow directly. Plainly such conclusions make it next to impossible to raise questions that go beyond explanatory adequacy – the “why” questions – and also pose serious barriers to inquiry into how the faculty might have evolved, matters discussed inconclusively at the 1974 conference.

A few years later, a new approach suggested ways in which these paradoxes might be overcome. This principles and parameters (P&P) approach was based on the idea that the format consists of invariant principles and a “switch-box” of parameters – to adopt Jim Higginbotham's image. The switches can be set to one or another value on the basis of fairly elementary experience. A choice of parameter settings determines a language. The approach largely emerged from intensive study of a range of languages, but as in the early days of generative grammar, it was also suggested by developments in biology – in this case, François Jacob's ideas about how slight changes in the timing and hierarchy of regulatory mechanisms might yield great superficial differences (a butterfly or an elephant, and so on). The model seemed natural for language as well: slight changes in parameter settings might yield superficial variety, through interaction of invariant principles with parameter choices. That's discussed a bit in Kant lectures of mine at Stanford in 1978, which appeared a few years later in my book Rules and Representations (1980).

The approach crystallized in the early 1980s, and has been pursued with considerable success, with many revisions and improvements along the way. One illustration is Mark Baker's demonstration, in his book Atoms of Language (2001), that languages that appear on the surface to be about as different as can be imagined (in his case Mohawk and English) turn out to be remarkably similar when we abstract from the effects of a few choices of values for parameters within a hierarchic organization that he argues to be universal, hence the outcome of evolution of language.

Looking with a broader sweep, the problem of reconciling unity and diversity has constantly arisen in biology and linguistics. The linguistics of the early scientific revolution distinguished universal from particular grammar, though not in the biolinguistic sense. Universal grammar was taken to be the intellectual core of the discipline; particular grammars are accidental instantiations. With the flourishing of anthropological linguistics, the pendulum swung in the other direction, towards diversity, well captured in the Boasian formulation to which I referred. In general biology, a similar issue had been raised sharply in the Cuvier–Geoffroy debate in 1830. Cuvier's position, emphasizing diversity, prevailed, particularly after the Darwinian revolution, leading to the conclusions about near infinitude of variety that have to be sorted out case by case, which I mentioned earlier. Perhaps the most quoted sentence in biology is Darwin's final observation in Origin of Species about how “from so simple a beginning, endless forms most beautiful and most wonderful have been, and are being, evolved.” I don't know if the irony was intended, but these words were taken by Sean Carroll (2005) as the title of his introduction to The New Science of Evo Devo, which seeks to show that the forms that have evolved are far from endless, in fact are remarkably uniform, presumably, in important respects, because of factors of the kind that Thompson and Turing thought should constitute the true science of biology. The uniformity had not passed unnoticed in Darwin's day. Thomas Huxley's naturalistic studies led him to observe that there appear to be “predetermined lines of modification” that lead natural selection to “produce varieties of a limited number and kind” for each species.

Over the years, in both general biology and linguistics the pendulum has been swinging towards unity, in the evo-devo revolution in biology and in the somewhat parallel minimalist program.

The principles of traditional universal grammar had something of the status of Joseph Greenberg's universals: they were descriptive generalizations. Within the framework of UG in the contemporary sense, they are observations to be explained by the principles that enter into generative theories, which can be investigated in many other ways. Diversity of language provides an upper bound on what may be attributed to UG: it cannot be so restricted as to exclude attested languages. Poverty of stimulus (POS) considerations provide a lower bound: UG must be at least rich enough to account for the fact that internal languages are attained. POS considerations were first studied seriously by Descartes, to my knowledge, in the field of visual perception. Of course they are central to any inquiry into growth and development, though for curious reasons, these truisms are considered controversial only in the case of language and other higher human mental faculties (particular empirical assumptions about POS are of course not truisms, in any domain of growth and development).

For these and many other reasons, the inquiry has more stringent conditions to satisfy than generalization from observed diversity. That is one of many consequences of the shift to the biolinguistic perspective; another is that methodological questions about simplicity, redundancy, and so on, are transmuted into factual questions that can be investigated from comparative and other perspectives, and may reduce to natural law.

Apart from stimulating highly productive investigation of languages of great typological variety, at a depth never before even considered, the P&P approach also reinvigorated neighboring fields, particularly the study of language acquisition, reframed as inquiry into setting of parameters in the early years of life. The shift of perspective led to very fruitful results, enough to suggest that the basic contours of an answer to the problems of explanatory adequacy might be visible. On that tentative assumption, we can turn more seriously to the “why” questions that transcend explanatory adequacy. The minimalist program thus arose in a natural way from the successes of the P&P approach.

The P&P approach also removed the major conceptual barrier to the study of evolution of language. With the divorce of principles of language from acquisition, it no longer follows that the format that “limits admissible hypotheses” must be rich and highly structured to satisfy the empirical conditions of language acquisition, in which case inquiry into evolution would be virtually hopeless. That might turn out to be the case, but it is no longer an apparent conceptual necessity. It therefore became possible to entertain more seriously the recognition, from the earliest days of generative grammar, that acquisition of language involves not just a few years of experience and millions of years of evolution, yielding the genetic endowment, but also “principles of neural organization that may be even more deeply grounded in physical law” (quoting from my Aspects of the Theory of Syntax (1965), a question then premature).

Assuming that language has general properties of other biological systems, we should be seeking three factors that enter into its growth in the individual: (1) genetic factors, the topic of UG; (2) experience, which permits variation within a fairly narrow range; (3) principles not specific to language. The third factor includes principles of efficient computation, which would be expected to be of particular significance for systems such as language. UG is the residue when third-factor effects are abstracted. The richer the residue, the harder it will be to account for the evolution of UG, evidently.

Throughout the modern history of generative grammar, the problem of determining the general nature of language has been approached “from top down,” so to speak: how much must be attributed to UG to account for language acquisition? The minimalist program seeks to approach the problem “from bottom up”: how little can be attributed to UG while still accounting for the variety of internal languages attained, relying on third-factor principles? Let me end with a few words on this approach.
