Mathematics and the Real World

The anchoring effect may be viewed as irrational. The axiom that an irrelevant factor should have no effect would be accepted by any reasonable person. It is not always easy to recognize when a factor is relevant and when it is irrelevant, but the random outcome of a spin of a roulette wheel is definitely irrelevant to an estimate of the size of any population. Yet the fact is that the roulette-wheel outcome does have an influence. This is not really surprising. The effect reflects evolutionary rationality. People who have to make a decision do not have time to “waste” on an intelligent analysis of what is relevant and what is not and, after such an analysis, to ignore the irrelevant data. We have already stated that the brain cannot think subject to imposed conditions and axioms, or examine logically what is relevant and what is not. The brain looks for any data that arise in connection with a decision and decides according to those data, without checking and clarifying the extent of their relevance. In most cases where action or a decision is required, this system is apparently more efficient from the cost/benefit aspect, and it therefore became ingrained by evolution in human behavior.

The above example indicates one of the structures according to which the brain operates when action or a decision is called for: the brain completes the picture of the situation with the information it already holds, without carefully checking the relevance and logical impact of that information. We saw other examples of such analyses in section 42, when we examined answers given intuitively compared with those given following calculations based on Bayes's formula. Although the questions lacked information that was essential to solving the problem, those questioned were unaware of that when they answered intuitively. The brain completes the picture in such a way that it can provide an answer.
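
As a reminder of the kind of calculation that Bayes's formula requires, here is a minimal sketch in Python; the setup (a rare condition with a 1 percent base rate and a test that is correct 90 percent of the time) is an illustrative assumption of mine, not an example taken from section 42.

    # Bayes's formula: P(sick | positive test) =
    #     P(positive | sick) * P(sick) / P(positive)
    # The numbers used below are illustrative assumptions, not figures from the book.

    def posterior(prior, true_positive_rate, false_positive_rate):
        # Total probability of a positive result, over sick and healthy people alike.
        p_positive = true_positive_rate * prior + false_positive_rate * (1 - prior)
        return true_positive_rate * prior / p_positive

    print(posterior(prior=0.01, true_positive_rate=0.90, false_positive_rate=0.10))
    # Prints roughly 0.083. The intuitive answer "about 90 percent" ignores the
    # base rate, which is precisely the essential information the brain fills in
    # for itself instead of noticing that it is missing.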

The following is another example (taken from linguistics) of how the brain completes the picture with what it already knows, without a careful logical examination. Try to rephrase the following sentence:

There is no head injury too minor for you to ignore.

The great majority of people coming across this statement understand it to mean that every head injury, however minor, should be treated. A close examination of what it says will reveal that it claims the opposite: taken literally, it says that no head injury is so minor that it cannot be ignored, in other words, that no head injuries at all need to be treated! What causes the confusion? First, as we claimed back in section 5, the brain has difficulty in analyzing claims that contain a negative, as in sentences that start with “there is no” or that have the word “ignore,” with its inherent negation. At the same time, the brain recognizes the content (not the meaning) of the claim, as it has often come across warnings about head injuries. It therefore combines the general content of the claim with what it already knows and skips the logical analysis. Evolutionary rationality dictates this type of reaction.

Another characteristic of decision making is also related to the structure of the brain: the brain cannot analyze a problem without preconceptions (not to be confused with prejudices). When someone is confronted with a problem or a question, a preconception is brought into play immediately, sometimes based simply on how the question is worded or how the information is presented. Thus, the reaction of a sick person to the doctor's prognosis that he has an 80 percent chance of recovering is different from his reaction to the news that he has a 20 percent chance of remaining sick for the rest of his life. The rational mind would recognize that an 80 percent chance of success is the same as a 20 percent chance of failure. A mind that acts only in an evolutionarily rational way will not recognize the equivalence. That recognition is not worth the effort needed to compare and analyze the facts in every case, and we therefore cling to concepts and attitudes already present in our brains. The popular belief that we can examine a question without preconceptions is incorrect.
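
The equivalence itself is a one-line piece of arithmetic, as the following sketch (with hypothetical numbers) makes explicit; it is the framing, not the computation, that defeats us.

    # The same prognosis stated two ways (illustrative numbers only).
    chance_of_recovery = 0.80
    chance_of_remaining_sick = 1.0 - chance_of_recovery   # 0.20
    # The two statements carry exactly the same information,
    # yet experiments show that patients react to them very differently.
    assert abs(chance_of_recovery + chance_of_remaining_sick - 1.0) < 1e-12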

Some time ago, one of Israel's past ministers for internal security proposed that a criminal case passed from the police to the state prosecutor should no longer be accompanied, as it always had been until then, by a recommendation from the police as to whether or not the suspect should be charged. The minister said that it was up to the state prosecutor's department to make its own decision on the basis of the material it received, without preconceptions. The honorable minister was clearly unaware that there is no such thing as “without preconceptions.” If the police do not give their view regarding prosecution of the suspect, the discussion in the state prosecutor's department will start from a different preconception, almost certainly a less authoritative one, based for example on media reports.

Another behavioral pattern rooted in evolution is the default to believe what you are told. Most of us are familiar with the situation in which newspaper reports of events we ourselves witnessed are not exact or reliable. Yet if we read in the same newspaper a report of an incident whose accuracy we are unable to verify, we believe what is written. Again, the reason is evolutionary rationality. It is not efficient to doubt what we are told, even if that sometimes leads to absurdities.

Here is a description of an experiment that, although it could be considered a quaint tale, illustrates an interesting pattern of behavior. A group of monkeys was placed in a cage in which there was a pole with a banana at the top. When one of the monkeys started climbing the pole to take the banana, all the monkeys were given a small electric shock. The monkeys soon learned what it was that brought on the shocks, and whenever a monkey started to climb the pole the others would stop him with blows and threats. Then one of the monkeys was replaced by a new one from outside the cage. The newcomer immediately tried to climb the pole to get to the banana but was stopped by the others so that they would not receive an electric shock. After a while another fresh monkey replaced one of the group, and the same thing happened: the new member of the group tried to get to the banana and was prevented from doing so by the others. Among those hitting him was the monkey that had been brought in previously. After this process of replacing monkeys had been repeated several times, there were no monkeys left in the cage that had ever received an electric shock. Moreover, the electric current had been disconnected. Yet whenever a monkey showed signs of intending to climb the pole to take the banana, the others fell upon him and beat him vigorously.

It is easy to identify such behavior in all human societies and among many groups of animals. The trait is the result of evolution. A child who devotes himself to checking every instruction and every piece of advice his parents and teachers give him will not get very far. Although science allows, and even supposedly requires, every scientific theory to be treated skeptically and critically and every result to be checked, such doubts and checks are not part of students’ and researchers’ practice, even in mathematics! That is evolutionary rationality ingrained in our genes, and it is of course consistent with the efficiency of our day-to-day lives. Such behavior, however, is responsible for many serious ills in human society. Apparently the evolutionary advantage of the strategy of believing what we are told outweighs the drawbacks resulting from that strategy.

Another pattern deep-rooted in our brains is the belief that what was, will be. The default is not to believe any prophecies, and in particular prophecies of doom, about future changes. The reason, again, derives from evolution. Devoting time to what may occur and preparing oneself accordingly consumes valuable resources that are needed in the evolutionary struggle, as we mentioned previously in connection with the hypothetical possibility that the dinosaurs could have grown gills to preempt the disastrous effect of the meteoric dust that led to their extinction. Acting according to the default that what was, will be, explains the great surprise with which we react to every abrupt change, even though, once changes occur, it is easy to see that they could have been foreseen.

Yet another behavioral pattern derived from evolution that deviates from mathematical rationality can be seen in an experiment performed by Güth, Schmittberger, and Schwarze, reported in 1982. Two decision makers have to divide a hundred dollars between them in the following way. The first, let us call him A, must decide what share of the hundred dollars he wants to be left with and what share he will offer to the other, B, with the proviso that he must leave B at least one dollar. B can either accept what he is offered or reject the offer. If he rejects the offer, neither of them will receive anything. It is explained to them that this is a one-off, non-repeated game, and furthermore, to emphasize the point, neither knows against whom he is playing, and both are assured that this anonymity will be maintained. What amount is it worthwhile for A to offer, and how ought B to react? The second part of that question has a very clear rational answer (assuming that B prefers to receive a positive number of dollars rather than nothing): whatever positive amount A offers B, it is worthwhile for B to accept. Hence, assuming such rationality, it is worthwhile for A to offer B one dollar. The argument is clear and is based on rationality that says to choose what is best for yourself. But it is not surprising that the participants in the experiment behave differently. A generally offers B an amount between forty and fifty dollars, and B generally rejects any offer of less than forty dollars. The explanations given by the participants were largely related to arguments of justice and fairness. The search for justice is ingrained in us by evolution, and the sense of justice has been observed even in day-old infants, as we mentioned at the end of section 3. The dependence on justice and fairness reflects evolutionary rationality and not logical rationality.
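
The reasoning in this paragraph can be spelled out in a few lines of Python. The sketch below computes the backward-induction outcome under pure monetary rationality and contrasts it with an outcome in which B rejects anything he considers unfair; the forty-dollar threshold used to model that behavior is my own illustrative assumption based on the figures reported above.

    # Ultimatum game over $100: A proposes how much B gets (at least $1);
    # B either accepts, or rejects and both get nothing.

    TOTAL = 100

    def smallest_accepted_offer(accepts):
        # A, wanting to keep as much as possible, offers the smallest amount
        # (at least one dollar) that B is expected to accept.
        for offer in range(1, TOTAL):
            if accepts(offer):
                return offer
        return None

    # A purely rational B prefers any positive amount to nothing.
    offer = smallest_accepted_offer(lambda dollars: dollars >= 1)
    print(f"Rational prediction: A keeps ${TOTAL - offer}, B gets ${offer}")

    # Real participants tend to reject offers they judge unfair
    # (modeled here, hypothetically, as anything under $40).
    offer = smallest_accepted_offer(lambda dollars: dollars >= 40)
    print(f"Fairness-driven outcome: A keeps ${TOTAL - offer}, B gets ${offer}")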

Another property that is sometimes not rational but reflects evolutionary rationality is the protection of property, that is, the tendency to keep property that has been acquired. This tendency certainly came about in the evolutionary process, when private property was essential to subsistence. Daniel Kahneman and his colleagues carried out a trial on this. Two groups of people are offered a choice involving a cup worth about ten dollars and a ten-dollar bill. The first group is given a straight choice between the cup and the bill, while the second group is given the cup as a gift on entering the hall where the experiment is taking place and is then offered the ten-dollar bill in exchange for the cup. The fact that the cup is already in the possession of the second group has a strong effect on their choice. The subconscious desire to preserve their property, reflecting evolutionary rationality, overcomes simple logic and mathematical rationality. The reader will find more examples and a broader discussion of this subject in the list of sources.

And at last we arrive at the question we asked at the beginning of the previous section. Actual human conduct often deviates from rational behavior in the sense that it contradicts generally accepted basic assumptions, accepted also by those behaving in that way. That said, can mathematical methods be used to describe human behavior? Is it possible to formulate a theory based on other basic assumptions from which, with the help of the usual mathematical rules of deduction, it will be possible to analyze and understand human conduct? In my opinion the answer is yes, but the basic axioms for such a theory have to be derived from characteristics embedded in humans in the evolutionary process.

Can people be taught to behave in a logically rational way, that is, to fulfill in practice those axioms on which they agree in theory? And if so, is it worth doing? My answer to those two questions is the same as the answer I gave to a similar question related to Bayes's law: in my opinion there is no possibility of instilling into the brain rational behavior that contradicts behavior consistent with evolutionary rationality, and it is also not worth doing. The advantages of behavior that developed during evolution still exist. Nevertheless, there are cases when it is important to reach a rational decision, when the results of a decision are fateful and could cause immense harm if the decision follows evolutionary rationality rather than logical rationality. In those cases it is worthwhile to invest the time to look for the logically rational solution and not to base the decision on “gut feeling.”

Finally, there is serious debate about whether decision makers in key positions, whose decisions have fateful outcomes, rely on their gut feelings, or whether we can be confident that when all is said and done they arrive at their decisions on the basis of mathematical analysis. For instance, can we rely on a leader who must decide whether to start a war to analyze the situation calmly, perhaps subjectively, but from our point of view, rationally? It seems that that is an open question.

