
Figure 3.3. The Prisoner’s Dilemma

This officer is smart. Think about it. He has, in effect, made you an offer you can’t refuse. The truth of the matter is simple. Whatever your partner may choose to do, you are always better off confessing. If your partner decides to keep his mouth shut, then you either face a year in the slammer for doing the same thing or stroll off into the
sunset by informing against him. Similarly, if your partner decides to inform against you, then you either go down for the full term for deciding to hold out or halve your sentence by mirroring their betrayal. The reality of both of your predicaments is freakishly paradoxical. Logically speaking, self-preservation dictates that the only sensible course of action is to confess. And yet, it’s this same paralyzing logic that robs you both of the chance of minimizing your joint punishment by remaining silent.
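
To make the officer's arithmetic explicit, here is a minimal sketch in Python. The sentence lengths are assumptions chosen to fit the scenario above (a year each for mutual silence, freedom for the lone informer, the full term for the lone holdout, half the full term for mutual betrayal); the length of the full term itself is a hypothetical ten years, and only the ordering of the outcomes matters.

```python
# One-shot Prisoner's Dilemma: years YOU serve for each combination of
# choices. The figures are assumptions chosen to fit the scenario above;
# the full term is a hypothetical ten years, and only the ordering matters.
FULL_TERM = 10  # hypothetical length of the maximum sentence

years_served = {
    # (your choice, partner's choice): years you serve
    ("silent",  "silent"):  1,               # both hold out: a year each
    ("silent",  "confess"): FULL_TERM,       # you hold out, he informs: full term
    ("confess", "silent"):  0,               # you inform, he holds out: you walk
    ("confess", "confess"): FULL_TERM // 2,  # mutual betrayal: half the term
}

for partner in ("silent", "confess"):
    hold_out = years_served[("silent", partner)]
    confess = years_served[("confess", partner)]
    print(f"partner's choice: {partner:8s} -> silence costs you {hold_out} years, "
          f"confessing costs you {confess}")
    assert confess < hold_out  # confessing is the better bet either way
```

Whatever your partner chooses, confessing leaves you with the shorter sentence, which is precisely the trap the officer has set.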

And note that the question of probity—remaining tight-lipped because it’s the “right” thing to do—doesn’t come into it. Quite apart from the dubious moral worth of placing oneself in a position that is self-evidently prone to exploitation,
the whole purpose of the Prisoner’s Dilemma is to ascertain optimal behavioral strategies not within frameworks of morality, with philosophical enforcers working the doors, but within a psychological vacuum of zero moral gravity … such as that which comprises the natural world at large.

So could the psychopaths be right? Could it really be survival of the fittest out there? Such a strategy, it would seem, is certainly logical. In a one-off encounter such as the Prisoner’s Dilemma, you might argue that dog-eat-dog (or a strategy of defection, to use the official terminology) constitutes a winning hand. So why not, in that case, just go ahead and play it?

The reason, of course, is simple. Life, in its infinite complexity, doesn’t go in for one-offs. If it did, and the sum total of human existence was an endless succession of ships passing in the night, then yes, the psychopaths among us would indeed be right, and would quickly inherit the earth. But it isn’t. And they won’t.
Instead, the screen of life is densely populated with millions upon millions of individual pixels, the repeated interaction of which, the relationships between which, gives rise to the bigger picture. We have histories—social histories—with each other. And we are able, unlike the characters in the Prisoner’s Dilemma, to communicate. What a difference that would have made! But that’s okay—just as we are able to play the Prisoner’s Dilemma the one time, so we can play it a number of times. Over and over. Swapping prison terms for a system of reward and punishment in which points are won or lost (see figure 3.4), we are able, with the aid of some simple mathematics, to simulate the complexity of real life, in exactly the same way as we did with Jim and Buzz.

Figure 3.4. A sample Prisoner’s Dilemma game
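
The exact point values in figure 3.4 are not reproduced here, so the sketch below uses the values conventionally attached to Axelrod-style tournaments: three points each for mutual cooperation, one each for mutual defection, five for a lone defector, nothing for a lone cooperator. Any numbers with the same ordering tell the same story.

```python
# Points version of the dilemma. The specific values are an assumption
# (the conventional Axelrod-style payoffs), standing in for figure 3.4;
# only their ordering (temptation > reward > punishment > sucker) matters.
PAYOFF = {
    ("cooperate", "cooperate"): (3, 3),  # reward for mutual cooperation
    ("cooperate", "defect"):    (0, 5),  # sucker's payoff vs. temptation to defect
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # punishment for mutual defection
}

# Over ten repeated rounds, mutual cooperation comfortably beats mutual
# defection, even though defecting wins any single exchange.
print(10 * PAYOFF[("cooperate", "cooperate")][0])  # 30 points each
print(10 * PAYOFF[("defect", "defect")][0])        # 10 points each
print(10 * PAYOFF[("defect", "cooperate")][0])     # 50 points for a defector exploiting a cooperator
```

Betrayal still wins any single exchange, but across repeated rounds two cooperators comfortably out-earn two defectors, which is the whole reason iteration changes the game.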

What happens then? Do the psychopaths cut it in a world of repeated encounters? Or is their strategy trumped by simple “safety in numbers”?

Saints Against Shysters

To answer this question, let’s imagine a society slightly different from the one we currently live in: a society like that of days gone by, in which the workforce is paid in cash at the end of each week, in personalized little brown envelopes. Now imagine that we can divide this workforce into two different types of people. The first type is honest and hardworking and puts in a full week’s work. Let’s call them the saints. The other is dishonest and lazy and preys upon its diligent counterparts as they make their way home on a Friday, lying in wait outside the factory gates and appropriating their hard-earned wages for themselves. Let’s call them the shysters.

At first it would seem as if the shysters have got it made: that crime pays. And, indeed, in the short term at least, it does. The saints clock in to keep the community going, while the shysters reap a twofold benefit. Not only do they enjoy the advantages of living in a flourishing society, they also, by stealing the saints’ wages, “get paid” for doing nothing. Nice work if you can get it. But notice what happens if the pattern of behavior continues. The saints begin to tire and fall sick. Having less disposable income with which to look after themselves, they begin to die out. Gradually, the ratio of the “working” population starts to shift in favor of the shysters.

But this, of course, is precisely what the shysters don’t want! With the number of saints diminishing by the week, the likelihood increases that the shysters will encounter each other. Moreover, even if they do run into a saint, there’s a greater chance that they’ll come away empty-handed. Another shyster may well have beaten them to it.

Eventually, if the fun and games are allowed to play out naturally, the power balance comes full circle. The pendulum swings back in favor of the saints, and society reverts to working for a living. But note how history is programmed to repeat itself. The saints call the shots for only such time as the economy is in recession, and the shysters preside for only as long as the saints can keep them afloat. It’s a bleak carousel of recurring boom and bust.
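
The cycle can be reproduced with a toy model. Everything in the sketch below, the effort cost of honest work, the cost two shysters pay when they collide, and the rate at which workers copy whichever strategy paid better last week, is an assumption invented purely to generate the boom-and-bust pattern just described; none of the numbers come from the text.

```python
# Toy model of the saints-and-shysters cycle. All parameter values are
# assumptions chosen only to reproduce the boom-and-bust pattern described
# in the text; none of them come from the book.
WAGE = 1.0     # the weekly pay packet
EFFORT = 0.3   # cost a saint pays for a week's honest work
CLASH = 1.0    # cost to a shyster of running into another shyster

def weekly_payoffs(s):
    """Average payoff to each type when a fraction s of the workforce are shysters."""
    saint = (1 - s) * WAGE - EFFORT       # keeps the wage only if not robbed
    shyster = (1 - s) * WAGE - s * CLASH  # a robbery pays only if the victim is a saint
    return saint, shyster

s = 0.05            # start with shysters as a small minority
SWITCH_RATE = 0.4   # share of the workforce that copies last week's better strategy
for week in range(12):
    saint_pay, shyster_pay = weekly_payoffs(s)
    print(f"week {week:2d}: shyster share {s:.2f}  "
          f"saint pay {saint_pay:+.2f}  shyster pay {shyster_pay:+.2f}")
    if shyster_pay > saint_pay:
        s = s + SWITCH_RATE * (1 - s)  # crime pays, so more workers turn shyster
    else:
        s = s - SWITCH_RATE * s        # honesty pays, so the pendulum swings back
```

Run it and the shyster share of the workforce repeatedly overshoots and collapses around the break-even point: the bleak carousel of the paragraph above, in miniature.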

This brief sketch of two very different work ethics is, to say the least, a simplified representation of an infinitely more complex set of dynamics. Yet it is precisely this simplification, this behavioral polarization, which lends such a model its power. Pure unconditional aggression and pure unconditional capitulation are destined to fail as strategies of social exchange in a society of multiple interaction and mutual dependence. In what essentially amounts to a peripatetic seesaw effect, each strategy is vulnerable to exploitation by the other once one has gained the ascendancy: once the proponents of one strategy become enough of a mob to be parasitized by those of the competing strategy. To coin a phrase from the sociobiology lexicon: as strategies for survival, neither unqualified cooperation nor unqualified competition may be regarded as evolutionarily stable.
Both may be trumped by invading or mutating counterstrategies.

[Footnote: Robber bees raid the nests of other colonies, killing the resident bees, including, in some cases, the queen, in order to appropriate their honey. Hives protect themselves from robbers by appointing guard bees on sentry duty at the hive entrance, to be on the lookout for the raiders and to fight them to the death in the event of an attack. In a study just out, however, a combined team of researchers from the University of Sussex in the U.K. and the University of São Paulo in Brazil has uncovered the world’s very first “soldier” bee. This subcaste of the Jatai bee (Tetragonisca angustula), unlike normal guard bees in honeybee colonies, is physically specialized to perform the task of protecting the hive. It is 30 percent heavier than its forager nest mates, and has larger legs and a smaller head. Perhaps they should call it the “berserker bee.” (See Christoph Grüter, Cristiano Menezes, Vera L. Imperatriz-Fonseca, and Francis L. W. Ratnieks, “A Morphologically Specialized Soldier Caste Improves Colony Defense in a Neotropical Eusocial Bee,” PNAS 109, no. 4 (2012): 1182–86. doi:10.1073/pnas.1113398109.)]

But can we actually observe this iterative process in action, this repeated unfolding of the Prisoner’s Dilemma dynamic? We are, after all, firmly in the realm of a thought experiment here. Do these abstract observations pan out in real life? The answer depends on what we mean by “real.” If in “real” we’re prepared to include the “virtual,” then it turns out we might be in luck.

Virtual Morality

Suppose I were conducting an experiment on people’s responses to the unexpected and I presented you with the following opportunity: For a thousand dollars, you must take off all your clothes and walk, stark naked, into a bar to join a group of friends. You must sit at a table and talk to them for five minutes (that’s two hundred dollars a minute!), during which time you will feel the full force of the excruciating social embarrassment that will undoubtedly accompany the venture. However, after the five minutes have elapsed, you will leave the bar unscathed, and I will ensure that neither you nor anyone else who was present will have any memory of the event. I shall erase it all. Apart from the crisp bundle of notes burning a hole in your pocket, it will be as if nothing had ever happened.

Would you do it? In fact, how do you know you haven’t done it already?

There are some people, I’m sure, who would gladly bare all for the sake of scientific advancement. How liberating it would be if somehow, somewhere, in the terraces and tenements of time, we could check in and out of a transient, encapsulated world where experiences are rented by the hour. This, of course, was very much the theme of The Matrix: humans inhabiting a virtual world, which appeared, at the time, compulsively, compellingly real. But what of the flip side? What of computers inhabiting a world that is human?

In the late 1970s, the political scientist Robert Axelrod asked exactly this question in relation to the Prisoner’s Dilemma—and hit upon a method of digitizing the paradigm, of determining a strategy, over time and repeated interaction, that ticked all the boxes of evolutionary stability. He sequenced the genome of everyday social exchange.

First, Axelrod approached a number of the world’s leading game theorists about the idea of holding a Prisoner’s Dilemma tournament in which the sole participants were computer programs. Second, he urged each theorist to submit a program to take part in the tournament that embodied a set, prespecified strategy of cooperative and competitive responses. Third, once all the submissions had been received (there were fourteen of them in all), he set up a preliminary round prior to the commencement of the contest’s main event, in which each of the programs competed against the others for points. At the conclusion of this round, he added up the number of points that each program had accrued, and then kicked off the tournament proper, with the proportion of programs represented corresponding to the number of points that each had amassed in the preceding round—precisely in line with the strictures of natural selection. Then he sat back and watched what happened.
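
A rough sketch of those mechanics might look like the following. The three strategies are simple stand-ins rather than the fourteen programs actually submitted, and the point values are the conventional Axelrod-style payoffs rather than figures quoted from the tournament itself.

```python
# Sketch of the tournament mechanics described above: a preliminary
# round-robin for points, then representation in the next round in
# proportion to the points amassed. The strategies and payoff values
# are illustrative assumptions, not Axelrod's actual entries.
import itertools

PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}
ROUNDS = 200  # length of each iterated match

def always_cooperate(my_history, their_history):
    return "C"

def always_defect(my_history, their_history):
    return "D"

def alternator(my_history, their_history):
    return "C" if len(my_history) % 2 == 0 else "D"

STRATEGIES = {"ALL-C": always_cooperate,
              "ALL-D": always_defect,
              "ALTERNATE": alternator}

def play_match(strat_a, strat_b):
    """Play one iterated match and return each program's total points."""
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(ROUNDS):
        move_a = strat_a(hist_a, hist_b)
        move_b = strat_b(hist_b, hist_a)
        pts_a, pts_b = PAYOFF[(move_a, move_b)]
        score_a, score_b = score_a + pts_a, score_b + pts_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# Preliminary round: every program meets every other, and points are totaled.
totals = {name: 0 for name in STRATEGIES}
for (name_a, a), (name_b, b) in itertools.combinations(STRATEGIES.items(), 2):
    score_a, score_b = play_match(a, b)
    totals[name_a] += score_a
    totals[name_b] += score_b

# Tournament proper: representation is proportional to the points amassed,
# echoing the strictures of natural selection.
grand_total = sum(totals.values())
for name, points in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {points:5d} points -> {points / grand_total:.0%} of the next field")
```

With stand-ins this crude, the unconditional defector tops the preliminary table; the point of the exercise is the machinery itself, a round-robin followed by points-proportional representation, into which any strategy, including the one described next, can be slotted.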

What happened was pretty straightforward. The most successful program by far was also, by far, the most simple. TIT FOR TAT, designed by the Russian-born mathematician and biologist Anatol Rapoport, whose pioneering work on social interaction and general systems theory has been applied to issues of conflict resolution and disarmament not just in the lab but on the political stage at large, did exactly what it said on the label. It began by cooperating, and then exactly mirrored its competitor’s last response. If, on trial 1, for example, that competitor also cooperated, then TIT FOR TAT would continue to follow suit. If, on the other hand, the rival program competed, then on subsequent trials it got a taste of its own medicine … until such time as it switched to cooperation.
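
In code, the strategy is almost embarrassingly short. The rival's scripted sequence of moves and the point values below are illustrative assumptions; the two rules, open with cooperation and then mirror the opponent's previous move, are the whole of TIT FOR TAT.

```python
# TIT FOR TAT as described above: cooperate on the first move, then copy
# whatever the rival did on the previous one. The rival's scripted moves
# and the point values are illustrative assumptions.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def tit_for_tat(their_history):
    """Open with cooperation, then mirror the opponent's last response."""
    return "C" if not their_history else their_history[-1]

their_history, total = [], 0
for rival_move in ["D", "D", "D", "C", "C", "C"]:  # a rival that defects, then relents
    my_move = tit_for_tat(their_history)
    total += PAYOFF[(my_move, rival_move)]
    print(f"rival: {rival_move}  TIT FOR TAT: {my_move}")
    their_history.append(rival_move)
print("TIT FOR TAT total:", total)  # punished the defection, then forgave it
```

The printout shows the pattern the next paragraph describes: one free cooperation up front, retaliation for as long as the rival defects, and an immediate return to cooperation the moment the rival relents.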

The graceful practicality and resilient elegance of TIT FOR TAT soon became apparent. It didn’t take a genius to see what it was up to. It embodied, spookily, soullessly, in the absence of tissues and synapses, those fundamental attributes of gratitude, anger, and forgiveness that make us—us humans—who we are. It rewarded cooperation with cooperation—and then reaped the collective benefits. It took out immediate sanctions against incipient competition, thus avoiding the reputation of being a soft touch. And in the aftermath of such rancor, it was able to return, with zero recrimination, to a pattern of mutual back-scratching, nipping in the bud any inherent potential for protracted, destructive, retrospective bouts of sniping. Group selection, that hoary evolutionary chestnut that that which is good for the group is preserved in the individual, didn’t come into it. If Axelrod’s experiment showed us anything at all, it was this: altruism, though undoubtedly an ingredient of basic group cohesion, is perfectly capable of arising not out of some higher-order differential such as the good of the species or even the good of the tribe, but out of a survival differential existing purely between individuals.

Macroscopic harmony and microscopic individualism were, it emerged, two sides of the same evolutionary coin. The mystics had missed the point. Giving wasn’t better than receiving. The truth, according to Robert Axelrod’s radical new gospel of social informatics, was that giving was receiving. And, what’s more, there was no known antidote. Unlike our earlier example of the saints and the shysters, in which a “tipping point” kicked in once the high end of the population seesaw assumed a certain level of ascendancy, TIT FOR TAT just kept on rolling. It was able, over time, to sweep all competing strategies off the field permanently. TIT FOR TAT wasn’t just a winner. Winning was just for starters. Once it got going, it was pretty much invincible.

Best of Both Worlds

Axelrod’s adventures in the world of “cybernethics” certainly raised a few eyebrows. Not just among biologists, but in philosophical circles, too. To demonstrate so convincingly that “goodness” was somehow inherent to the natural order, that it was an emergent property, as it were, of social interaction, succeeded only in driving an even bigger wedge between those on the side of God and those who put God to one side. What if our “better” nature wasn’t better, after all? But was, instead … well, just nature?
