
There is thus syntax and there is discourse (and of course there is pronunciation/visualization), and that is all there is. Beyond the possible forms that the computational system of language provides, there are no thoughts that you can think. You can of course think associative, poetic, non-propositional kinds of thoughts, too. But these are not the ones we are (here) trying to naturalize (or explain). It also follows from this that to whatever extent non-human animals partake in the systematicity and propositionality of human thought, they partake in whatever fragments of the computational system are needed to think them.

9.4 Building structure: Merge

Obviously this suggestion depends on getting clearer on what kinds of structures the computational system of language actually builds, and how. It is noteworthy in this regard that recent “minimalist” theorizing in the study of language has seen a rather severe deflowering of trees. While in the early days of Government & Binding Theory they were still richly decorated, with three-layered sub-trees built by the immutable laws of X-bar theory, and in the early days of minimalism (Chomsky 1995b) at least we still had “projections” of a somewhat meagre sort, as in (2), we now are left with (3) (Collins 2002):

(2)    {the, {the, man}}

(3)    {the, man}

The reason for this deflowering is closely linked to the rather sad history of categorial labels (like NP, P, V′, and so on), familiar from the days of Phrase Structure Grammar. First, they were demoted to the status of lexical items, and deprived of the X-bar theoretic bar-stroke that marked them as something other than that. So, for example, {the, man} would not be a D′ or a DP; it would just be “the”: labels such as this were said to be “designated minimal elements” in a syntactic object, whose job is to carry all the information relevant to the way that object enters into further computation. But then the drive of the minimalist program in recent years has been to show that the same information follows even without designating such elements, so that labels are eliminable after all (Chomsky 2006).

I will assume that this whole development is well-motivated within the assumptions that ground it. The deepest motivation is the elimination of a phrase-structure component in the grammar in favor of the sole operation Merge, defined as recursive set-formation. This operation I will now describe in more detail. Suppose, with Chomsky (2005a), that Merge is an operation that merely forms a set out of n elements, taken from some “lexicon.” Taking n = 1 as a special case, so that we have a one-membered lexicon, let us identify it for concreteness with the empty set. Merge then enumerates the following sequence:

(4)    ∅, {∅}, {{∅}}, {{{∅}}}, …

The function carrying us from any element in this series to its immediate successor is effectively the successor function, viewed as a generative principle that forms the immediate successor of an element simply by creating the set of which that element is the only member. We could also define this function so that the immediate successor of each such element is the set containing all and only its predecessors, and thus the entire history of its generation:

(5)    ∅, {∅}, {∅, {∅}}, {∅, {∅}, {∅, {∅}}}, …

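As a concrete illustration only (a sketch in Python, with invented function names, using frozenset to stand in for the sets involved), Merge as bare set-formation and the two successor principles behind (4) and (5) can be written out as follows:

# Sketch: Merge as recursive set-formation, plus the two successor
# principles behind sequences (4) and (5). All names are illustrative.

def merge(*elements):
    """Merge: form the set of its arguments."""
    return frozenset(elements)

EMPTY = frozenset()  # the single "lexical item," identified with the empty set

def succ_4(x):
    """(4): the immediate successor of x is the set whose only member is x."""
    return merge(x)

def succ_5(history):
    """(5): the immediate successor is the set of all and only the predecessors."""
    return merge(*history)

seq4 = [EMPTY]
seq5 = [EMPTY]
for _ in range(3):
    seq4.append(succ_4(seq4[-1]))   # 0, {0}, {{0}}, {{{0}}}   (0 = empty set)
    seq5.append(succ_5(seq5))       # 0, {0}, {0,{0}}, {0,{0},{0,{0}}}

# Every member of (4) after the first is a one-membered set: iterating this
# version of Merge yields only one kind of object.
print([len(x) for x in seq4])   # [0, 1, 1, 1]
print([len(x) for x in seq5])   # [0, 1, 2, 3]
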
Clearly, both (4) and (5) give us a discretely infinite series. We could then move further from here, and define algebraic operations such as addition, by which we can combine any two such elements. The result will be a mathematical space, structured by certain relations. We could also add operations that carry us out of this space, such as subtraction, which opens up a new space, the space of the negatives, or division, which opens the space of the rational numbers. These spaces are not unrelated; in fact, some of them come to contain entire other such spaces: the reals, say, contain the rationals, and the rationals contain the naturals. So, it's really quite a playing-field. With each operation we add, our spaces get richer, and eventually there will be a whole hierarchy of spaces ordered by a relation of containment. Depending on the space in which the objects we generate live, they behave quite differently: they are different kinds of objects, and we therefore create different kinds of ontologies. These may relate quite regularly and systematically to one another, in the way, say, that a geometrical object such as a globe, e.g. the Earth, “contains” a surface as another, lower-dimensional kind of object, and that surface relates to the equator, a line, again a lower-dimensional kind of object (see further Uriagereka 2008: Chapter 8, for a discussion of topological notions in linguistics). The length of such a “chain” of different kinds of objects that contain one another is called the dimension of a space. Crucially, a space generated by the operation Merge in (4), interpreted geometrically, would be only one-dimensional. Geometrically, we can view it as a line.
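
Concretely, the hierarchy of spaces just described can be pictured as a chain of containments among the familiar number systems, each obtained by closing the previous one under a further operation (subtraction, division, and so on):

\[
\mathbb{N} \subset \mathbb{Z} \subset \mathbb{Q} \subset \mathbb{R}
\]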

What is the point of all this? Chomsky (2005a), when discussing the above analysis of Merge, suggests that arithmetic and language are evolutionarily linked domains of cognition in which we find the same basic operation Merge instantiated. Merge in language, on this view, is simply an instance of a more general operation that generates the natural numbers in arithmetic, too, yielding a discretely infinite space in both cases. I come to how this works for language in a moment. For now what is important is this: Chomsky's viewpoint opens up the path for looking at syntactic objects from an abstract algebraic point of view, and for asking: what kind of algebraic space do syntactic objects form or inhabit? What is its dimensionality? Obviously, a single-dimensional space will only contain one category of object. All of its objects, that is, only vary along one dimension. A multi-dimensional space, like the numbers, on the other hand, will contain multiple categories of objects, many or all of them defined in terms of operations on lower-level objects, as we saw. What we need to see here is that if we view Merge on the model of the sequence in (4), above, then it is a consequence that Merge will only ever give us one kind of object, in either arithmetic or language. Merge will never carry us outside the one-dimensional space that it constructs. Our basic computational operation, therefore, won't be, as I shall say, ontologically productive: it will never generate new kinds of objects, ever.

I will suggest that this is a bad result, and just as Merge is far too poor a basis to yield the rest of arithmetic (all that goes beyond the integers), a naïve adaptation of Merge or Succ in (4) and (5) to the domain of language does not give us its full richness either. In the mathematical case, other operations generating inverses, at least, will have to be added in order for mathematical knowledge to come to exist in its present form. And if arithmetic is to be an evolutionary offshoot of language, as Chomsky (2005a) plausibly suggests, basic structure-building operations in language might therefore be richer as well.

9.5 Merge in language

Let me now be more explicit about the connection between (4) and the use of the same n-ary operation Merge in the linguistic system. It is commonly suggested that the restriction of Merge to n = 2 arguments follows from “interface conditions,” and I shall assume so here as well, for the sake of argument. There are then two “lexical” elements to which Merge needs to apply. Say we combine the list (6) into the set (7) first:

(6)    kill, buffalos

(7)    {kill, buffalos}

Then, due to the recursiveness of Merge, this must be a Merge-partner again, which, together with, say, a new lexical item, Hill, yields (8), and with a further morpheme -ed, yields (9):

(8)    {Hill, {kill, buffalos}}

(9)    {-ed, {Hill, {kill, buffalos}}}

If we allow Merge to apply ‘internally’ (target a syntactic object inserted earlier in the syntactic object already constructed), it can target ‘kill’ and copy it at the root, so as to obtain (10), and it can target ‘Hill’ in the same way, to yield (11):

(10)    {kill-ed, {Hill, {kill, buffalos}}}

(11)    {Hill, {kill-ed, {Hill, {kill, buffalos}}}}

If, finally, we knock out the phonetic values of lower copies of these lexical items, we obtain the well-formed, past-Tense sentence (12), Hill killed buffalos:

(12)    {Hill, {kill-ed, { … , { … , buffalos}}}}
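
As a rough sketch only (the Python rendering, the helper names, and the collapsing of the copied ‘kill’ plus the affix -ed into the single item "kill-ed" are simplifications for illustration, not the chapter's own formalism), the derivation of (7)–(12) by external and internal Merge can be replayed over nested sets as follows:

# Sketch: external Merge as two-membered set-formation; internal Merge as
# re-merging a copy of an already-contained sub-term at the root.

def merge(a, b):
    """External Merge: form the two-membered set {a, b}."""
    return frozenset({a, b})

def contains(obj, target):
    """True if target occurs somewhere inside the syntactic object obj."""
    return obj == target or (
        isinstance(obj, frozenset) and any(contains(x, target) for x in obj)
    )

def internal_merge(obj, target):
    """Internal Merge: copy a sub-term of obj and merge it at the root."""
    assert contains(obj, target)
    return merge(target, obj)

def show(obj):
    """Render nested frozensets in the brace notation of (7)-(12)."""
    if isinstance(obj, frozenset):
        return "{" + ", ".join(sorted(show(x) for x in obj)) + "}"
    return obj

so7 = merge("kill", "buffalos")       # (7)  {kill, buffalos}
so8 = merge("Hill", so7)              # (8)  {Hill, {kill, buffalos}}
so9 = merge("-ed", so8)               # (9)  {-ed, {Hill, {kill, buffalos}}}

# (10): in the text, internal Merge targets 'kill' inside (9) and the copy
# combines with -ed; that result is abbreviated here as the item "kill-ed".
so10 = merge("kill-ed", so8)          # (10) {kill-ed, {Hill, {kill, buffalos}}}
so11 = internal_merge(so10, "Hill")   # (11) {Hill, {kill-ed, {Hill, {kill, buffalos}}}}

print(show(so11))
# Suppressing the phonetic values of the lower copies then gives (12),
# pronounced "Hill killed buffalos".
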

As Chomsky correctly suggests, we do get hierarchy and unbounded embedding on this minimal vision of structure-building, hence discrete infinity, essentially for free. Yet, in my terms of the previous section, this hierarchy is mono-categorial. There is nothing more complex going on here than in (4) or (5). A ready defense of the minimalist conception of Merge, against the background of the standard architectural assumptions depicted in Fig. 9.1, could now be that, of course, different syntactic objects will yield categorially different semantic interpretations at the interface (when they are interpreted). But, in that case, there will be nothing in the syntax from which it could follow why this is so. All the syntax ever sees, on the standard conception, are lexical items, or else projections of lexical items, which, however, as we have seen, collapse into lexical items. If the presumed external “conceptual-intentional” or “C-I” systems make such a differentiation occur, they have to be formally richer than the structures we find in the human language system, as viewed on that conception. This is highly implausible in the light of the fact that the supposed C-I systems are thought to have whatever structure they have, independently of and prior to those we find in the language system. Looking at non-human animal accomplishments, this seems a very long shot indeed (see Penn et al. (in press) for a recent review of the comparative literature; or Terrace 2005 on iterated skepticism that propositionality is within the scope of the non-human animal mind).

If we go the interface route, we will merely have pushed the burden of explanation elsewhere. Structured thought in the putative C-I systems needs, well, structures – ones appropriate to the cognitive task. And if these structures are not formally equivalent to the ones we find in language, the danger is that we won't quite know what we are talking about. Where we address the structures of thought empirically, the inquiry usually leads us back to the very structures that the computational system of language provides for our thoughts. Even if we knew how to investigate the interface in language-independent terms, and we found an entirely independent system there, the strange tightness with which specific syntactic categories get paired with specific semantic constructs would seem mysterious: if the categories are there for independent reasons, as part of the constitution of these C-I systems, why should such syntactic specificity occur, and how could it be motivated? How could it be that we can actually study specific semantic phenomena, say predication, in syntactic terms, and that this provides advances in our understanding of the phenomena in question?

I propose, then, that we consider a radically different conclusion: that it is the syntax that yields a richly differentiated set of objects, as opposed to a single ontology. It yields different categories, where each category corresponds to an ontology, and an ontology may necessarily entail or contain another: thus, a fully specified and tensed proposition (‘That Caesar destroyed Syracuse’) clearly entails an event configured in a transitive verbal phrase (‘Caesar destroy Syracuse,’ without Tense), which in turn entails a state (Syracuse's being destroyed), which in turn entails an object that is in that state, the object Syracuse itself. These “vertical” hierarchies
