
From the Yeyi of Botswana: "When staying in a happy community be happy: when staying in a sad community be sad," and "It's the termites which cause the tree to fall down"—basically, minor disputes undermine the strength of the community.

(8)
It's also been demonstrated that people who believe in free will are less likely to cheat on tests or slack off on the job than those who believe in predestination. No one is sure why: perhaps believing that you don't have a choice in what you do undermines a person's sense of integrity, or perhaps it just provides a convenient excuse for giving in to selfish temptations, as if they were an unavoidable destiny. Predictably, individuals who embrace the concept of free will are also more likely to hold other people responsible for their own actions, which in turn makes them more likely to punish defectors. I'm not saying that the concept of free will is innate, or that it evolved as a societal pressure system, but it seems to function as one.

(9)
Hauser is a discredited academic. Harvard recently found him guilty of scientific misconduct; a paper has been retracted, and he's currently on leave and is no longer allowed to teach. Even so, his book has a lot of interesting insights into human moral systems.

(10)
Inbreeding is likely to result in recessive genetic disorders, making individuals less viable. This is why cheetahs, which are so inbred because of how close they came to extinction at some point in their history, have such a high disease rate: there's just not enough variety in the gene pool. The same is true of the Amish.

(11)
The game was the Ultimatum game (see note 14 in Chapter 3 for a full description). The goal was to find people isolated from modern society, and the Machiguenga tribe fit the bill. What Henrich found was that the first player tended to make what we would consider unfair divisions—85%/15% or so—and the second player would accept them. By contrast, people from modern societies playing the same game tend to reject such unbalanced divisions. In post-game interviews, Machiguenga subjects told him they would accept any offer. It's not that they were more willing than their more urbane counterparts to either be unfair or to accept unfairness, but that they considered the unfairness to have occurred at the point where the first and second players were chosen. That is, first players considered themselves lucky to have been chosen as first players, and second players thought it bad luck to have been chosen as second players. Once they accepted their positions, the division wasn't tainted with notions of fairness. The minority of tribesmen who responded to the game in a manner more similar to players from industrialized societies were those who had spent the most time interacting with people beyond their tribe.

(12)
Believe it or not, there are security systems to help ensure that employees wash their hands before leaving the restroom, mostly involving hand stains that don't come out without vigorous washing.

(13)
The phrase “bad apple” has been misused recently. More and more, it's used to mean isolated wrongdoers whose actions don't affect anyone else in the group. The entire phrase is “one bad apple spoils the entire bunch,” and is intended to explicitly highlight how the reputation of one person can taint the reputation of all people in the group. Incidentally, this is actually true for apples stored in a root cellar. A spoiled apple will cause the rest of the apples to spoil.

(14)
The logical extreme of this idea is the "broken windows theory" of James Q. Wilson and George Kelling: that visible signs of criminal activity, like broken windows and abandoned cars, actually incite people to commit crimes. Wilson and Kelling believed that if you clean up these visible signs of lawlessness, a neighborhood will become safer overall; societal pressures against petty crime will cause a reduction in violent crime.

It sounds good, and Kelling used the theory to explain the dramatic drop in crime in New York City in the 1990s, but it turns out there's not much actual evidence that it's true. Researchers compared New York City and other cities, and found that New York's punitive measures against low-level visible lawlessness—a lot of which might be considered punitive measures against homelessness—didn't make much of a difference. It's not that this effect doesn't exist at all—there is evidence that it does. It's that other causes of crime are more important, and focusing societal pressure on low-level criminal activities in the expectation that it will prevent other crimes is much less effective than directly preventing those other crimes.

Economist Steven Levitt looked at the reduction of crime across the U.S. in the 1990s and concluded: "Most of the supposed explanations…actually played little direct role in the crime decline, including the strong economy of the 1990s, changing demographics, better policing strategies, gun control laws, concealed weapons laws and increased use of the death penalty. Four factors, however, can account for virtually all of the observed decline in crime: increases in the number of police, the rising prison population, the waning crack epidemic and the legalization of abortion."

(15)
A recent study of 75,000 households served by the Sacramento Municipal Utility District and Puget Sound Energy found that customers who received peer comparison charts reduced their energy usage by an average of 1.2% to 2.1%, a change that was sustained over time. Of course, this isn't absolute. There are people who don't care, or don't care enough to make changes in their behavior—and there is evidence that this system backfires with some conservatives. Even so, enough people are swayed into cooperation by the comparison charts to make them an effective societal pressure system.

(16)
In Rwanda, marriages between members of the Hutu and Tutsi ethnic groups are common. But when extremist Hutus came to power in the early 1990s, they pushed an increased stigmatization of Rwandese in mixed marriages. In the new Hutu morality, Tutsi women were vilified as immoral temptresses, and the men who succumbed to their charms were viewed as traitors.

(17)
We tend to empathize more with people suffering from acute problems than with those in chronic need. Witness the outpouring of aid to the victims of the 2004 Indian Ocean tsunami versus the aid given annually for things like malnutrition.

(18)
The cash box was made of wood, with a slot for money. Initially Feldman used an open basket of money, but some people took the money. He then tried a coffee can with a lid, but people stole from that, too. A locked wooden box was enough of a deterrent. The only way to take the money was to steal the box itself, which only happened about once a year.

There are a host of unknowns in these data. Did everyone pay 90%, or did nine in ten pay full price and one in ten pay nothing? This sort of honor system offers many ways to partially defect. Still, it offers interesting insights into how moral pressure works. As prices rose, the payment rate fell. This makes sense: as the financial benefit of non-payment increased, some people who were just barely on the side of cooperation were willing to overcome the moral prohibition against theft. Data from the number of bagels eaten showed that price-sensitive customers were more likely to defect than more consistent consumers. This also makes sense. People who purchased donuts—he started bringing them in, too—were more likely to underpay than people who purchased bagels. Maybe this meant that donut eaters were less cooperative than bagel eaters, although it might have had something to do with the perceived price versus value of the two items, or the fact that donuts are considered junk food whereas bagels are not. And there was a sharp and persistent increase in payment following the 9/11 terrorist attacks, in line with the in-group loyalty effects I talked about earlier.

Chapter 8

(1)
Researchers have used the Prisoner's Dilemma to study this. People who defect predict a 76% defection rate from other players, and people who cooperate predict a 68% cooperation rate. Put in layman's terms, people reflexively think others are like themselves. More interestingly, in one experiment, people were asked to predict the behavior of other players after chatting with them for half an hour. Then, people were better at predicting who would cooperate and who would defect. In another experiment, players were asked to evaluate the intentions of their opponents at various points during a multi-round Prisoner's Dilemma game. Cooperative players were better at recognizing other cooperative players; defecting players regularly mischaracterized cooperative players as defecting. This isn't surprising since people tend to see themselves in others.

(2)
Reputation mattered in the various “game” experiments mentioned in Chapter 3: the Ultimatum game, the Dictator game, the Public Goods game, and so on. Subjects were more altruistic, more fair, and more cooperative when their actions were known to the researchers or when they met the other players, and less so when they were anonymous and alone in a room.

(3)
In 1984, political scientist Robert Axelrod studied an iterated Prisoner's Dilemma. He set up a computer tournament and invited academic colleagues from all over to compete against each other. What he found was interesting, and in hindsight fairly obvious. Successful strategies had four basic characteristics:

They were altruistic—Axelrod used the word “nice”—in that they did not defect before their opponent did.

They were retaliatory, and responded to defection with defection.

They were forgiving, and would cooperate again at some later point.

They were non-envious; their goal wasn't to outscore their opponent.

The most successful strategy—called “tit-for-tat”—was extremely simple. A tit-for-tat player would first cooperate, then mirror his opponent's previous move. If his counterpart cooperated in a round, then tit-for-tat would cooperate in the next. If his counterpart defected in a round, then tit-for-tat would defect in the next. If two tit-for-tats competed, they would both cooperate forever. Essentially, Axelrod discovered reputation.
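To make the tit-for-tat dynamic concrete, here is a minimal sketch of an iterated Prisoner's Dilemma in Python. The payoff values, round count, and the opponent strategy are illustrative assumptions, not the parameters of Axelrod's actual tournament.

PAYOFF = {
    ("C", "C"): 3,  # reward for mutual cooperation
    ("C", "D"): 0,  # sucker's payoff
    ("D", "C"): 5,  # temptation to defect
    ("D", "D"): 1,  # punishment for mutual defection
}

def tit_for_tat(my_history, their_history):
    # Cooperate on the first move, then mirror the opponent's previous move.
    return "C" if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return "D"

def play(strategy_a, strategy_b, rounds=200):
    # Play an iterated game and return each player's total score.
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

if __name__ == "__main__":
    # Two tit-for-tat players cooperate forever; against an unconditional
    # defector, tit-for-tat loses only the first round and retaliates thereafter.
    print(play(tit_for_tat, tit_for_tat))    # (600, 600)
    print(play(tit_for_tat, always_defect))  # (199, 204)

Note how all four characteristics above fall out of a one-line strategy: tit-for-tat never defects first, retaliates immediately, forgives as soon as the opponent cooperates again, and never tries to outscore its partner.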

(4)
The oft-quoted line is that the average dissatisfied customer will tell 9–10 of his friends, and that 13% will tell 20 or more people. On Facebook, they'll tell everyone they know; and on Yelp, they'll tell everyone they don't know. Of course, there's a difference between reputation learned firsthand and reputation learned secondhand, similar to the personal and impersonal trust discussed in Chapter 1.

(5)
Target stores used to go so far as to accept returns of items they knew weren't purchased at Target. They calculated it was better to accept the return than argue with the customer about where the item was purchased. They no longer do this; presumably too many defectors took advantage of the system.

(6)
Prisoner's Dilemma experiments confirm that when players know each other's reputations—instead of being anonymous—cooperation jumps from around 50% to around 80%.

(7)
Dueling isn't always irrational; an economic analysis of the practice demonstrates that it made sense, given the reputational realities of the day. Similarly, the deadly defense of reputations that occurs in the criminal underworld also makes economic sense.

(8)
Chimpanzees are able to learn about the reputation of others by eavesdropping on third-party interactions, but they do not directly communicate with each other about the reputation of other chimpanzees.

(9)
The Islamic notion of ihsan—that people should do right because God is always watching their thoughts and deeds—is relevant here. Pascal's Wager takes this view to a somewhat cynical conclusion: it's better to cooperate (believe in God, follow God's rules, and so on) than to defect, because the potential downside of defecting is so great.

(10)
Better yet, do good and let someone find out about it surreptitiously, as British essayist Charles Lamb commented: "The greatest pleasure I know, is to do a good action by stealth, and to have it found out by accident."

(11)
There is counter-evidence as well. In some circumstances, diversity seems to enhance cooperation. Eric Uslaner disputes Putnam's thesis, and argues that diverse communities can be more cooperative because people living in them are more likely to accept strangers into their "moral community." Clearly more research is required.

(12)
Two people living on opposite sides of the same Norwegian fjord would have spoken different dialects. Until recently, and possibly even still, it has been possible to identify the birthplace of native Britons to within 30 miles solely by their English dialects.

(13)
Anthropologist Daniel Nettle ran an interesting simulation, along similar lines to the Hawk-Dove game. He set up an artificial world where cooperation was necessary for survival, and individuals could choose whom they wished to cooperate with. When he allowed individuals to cooperate only with others who spoke the same dialect, hawks were kept down to a much smaller percentage of the total population than when dialect wasn't a factor. None of this is very surprising; we already know that reciprocity based on proximity is one of the ways cooperation can evolve in a species. Most interestingly, Nettle found that this system of using dialects as societal pressure worked best when dialects changed rapidly from generation to generation. The simulation mirrors how such changes occur in real life: historically, human dialects show clear differences over only a few generations.
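A toy version of this kind of simulation can be sketched in a few dozen lines of Python. This is only an illustration under assumed parameters (payoffs, population size, mutation rate), not Nettle's actual model; in particular, the rule that doves refuse to interact across dialect lines is my simplification of "cooperate only with others who speak the same dialect."

import random

# Each agent is a (strategy, dialect) pair in a Hawk-Dove-style world.
# All constants below are assumptions chosen for illustration.
POP_SIZE = 200
GENERATIONS = 100
ENCOUNTERS_PER_AGENT = 5
V, C = 2.0, 3.0            # value of a contested resource, cost of a hawk-hawk fight
DIALECTS = list("abcdef")
MUTATION_RATE = 0.05       # how quickly dialects drift between generations

def payoff(me, other):
    # Standard Hawk-Dove payoffs for 'me' against 'other'.
    if me == "hawk":
        return (V - C) / 2 if other == "hawk" else V
    return 0.0 if other == "hawk" else V / 2

def run(dialect_filter):
    pop = [(random.choice(["hawk", "dove"]), random.choice(DIALECTS))
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        scores = [0.0] * POP_SIZE
        for _ in range(POP_SIZE * ENCOUNTERS_PER_AGENT):
            i, j = random.sample(range(POP_SIZE), 2)
            (si, di), (sj, dj) = pop[i], pop[j]
            # Simplifying assumption: doves skip cross-dialect encounters.
            if dialect_filter and di != dj and "dove" in (si, sj):
                continue
            scores[i] += payoff(si, sj)
            scores[j] += payoff(sj, si)
        # Reproduce in proportion to (shifted) score; dialects drift via mutation.
        low = min(scores)
        weights = [s - low + 0.01 for s in scores]
        next_pop = []
        for _ in range(POP_SIZE):
            strategy, dialect = random.choices(pop, weights=weights, k=1)[0]
            if random.random() < MUTATION_RATE:
                dialect = random.choice(DIALECTS)
            next_pop.append((strategy, dialect))
        pop = next_pop
    return sum(1 for s, _ in pop if s == "hawk") / POP_SIZE

if __name__ == "__main__":
    print("hawk share, dialect filtering on: ", run(True))
    print("hawk share, dialect filtering off:", run(False))

The sketch only shows the shape of the experiment—dialect tags that gate interaction and mutate each generation—rather than reproducing Nettle's reported results.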

(14)
We also try to adopt other cultural norms, to seem less foreign to others. We hand our business cards carefully with two hands to Japanese colleagues, and drink beer with German colleagues even if we prefer wine.
