Liars and Outliers

Author: Bruce Schneier

The converse of penalties is incentives: rewarding someone for cooperating. There is a whole class of institutional pressure systems designed to reward cooperative behavior. Examples include:

  • Tax deductions or tax credits for certain behaviors.
  • Faster tax refunds for people who file their returns electronically.
  • Time off a prison sentence for good behavior.
  • Employee bonuses.
  • Bounties and rewards for turning in wanted fugitives.
  • Deferred prosecution, or non-prosecution, of SEC violations as an incentive to provide evidence against other, larger violators.
  • Certifications, both coercive ones (FDA approval for new drugs) and optional ones (LEED certifications for buildings).
  • Whistle-blower statutes, under which the whistle-blower gets a percentage of the fraud recovered.

The problem with rewarding cooperators via an institutional mechanism is that it's expensive. If we assume that the majority will cooperate regardless of the reward, then a lot of people will get a reward for doing what they were going to do anyway. Either the reward will have to be very small, and not much of an additional incentive to cooperate, or the total cost of rewarding everyone will be very high. In general, it's more efficient to spend that money going after the minority of defectors.

Financial incentives and penalties interact weirdly with other categories of societal pressures. It's easy to regard societal pressures as cumulative—and to assume that moral plus institutional pressure will be more effective than morals alone—but our moral systems are more complicated than that.

In one experiment, participants were presented with a societal dilemma: they were in charge of a manufacturing plant that emitted toxic gas from its smokestacks. They could either spend more money to clean up a lot of the toxin, or spend less money to clean up a little bit of the toxin. The dilemma came from the fact that pending government legislation—a bad thing in the experiment's scenario—depended on how much cleaning up the manufacturing plants did collectively. It's a free-rider problem: a subject could either cooperate and clean up his share, or defect and hope that enough others cleaned up to forestall legislation.

What makes this experiment particularly interesting is that half of the subjects were also told that the industry would be inspecting smokestacks to verify compliance and fining defectors. It wasn't a big risk; both the chance of inspection and the cost of noncompliance were low. Still, inspections are a societal pressure, and you'd expect they would have some positive effect on compliance rates. Unexpectedly, they had a negative effect: subjects were more likely to cooperate if there were no noncompliance fines than if there were. The addition of money made it a financial rather than a moral decision. Paradoxically, financial penalties intended to discourage harmful behavior can have the reverse effect.

For this reason, signs featuring anti-littering slogans like “Don't Mess with Texas” are more effective than signs that only warn, “Penalty for Littering: $100”; and “smoking in hotel rooms is prohibited” signs are more effective than signs that read “$250 cleaning penalty if you smoke.” In one experiment with day care providers, researchers found that when they instituted a fine for parents picking their children up late, late pickups increased. The fine became a fee, which parents could simply decide to pay, assuaging any moral resistance to defection.

More generally, the very existence of rules or laws can counter moral and reputational pressure. Some towns are experimenting with eliminating all traffic laws and signs. The idea is that drivers who must follow the rules pay less attention to the world around them than drivers with no rules to follow.

Financial rewards have the same effect that financial penalties do; they engage the brain's greed system and disengage the moral system. A fascinating incident in Switzerland illustrates this. Trying to figure out where to put a nuclear waste dump, researchers polled residents of several small towns about how they would feel about it being located near them. This was 1993, and a lot of fear surrounded the issue; nonetheless, slightly more than half of the residents agreed to take the risk, for the good of the country.

In order to motivate the other half, the researchers offered money in exchange for siting the nuclear dump near them: about $2,000 per person per year. Instead of enticing more residents to accept the dump, the offer reduced their number by half. The researchers doubled and then tripled the amount offered, but it didn't make a difference. When the researchers had simply asked nicely, they stimulated the altruistic part of the residents' brains—and, in many cases, the residents decided it was the right thing to do. Again, the addition of money can increase the defection rate.

Financial advisors exhibit this unconscious bias in favor of their clients. In one experiment, analysts gave different weights to the same information, depending on what the client wanted to hear. An obvious societal pressure system to address this problem would be to require advisors to disclose any conflicts of interest; but this can have the reverse effect of increasing the number of defectors. By disclosing their conflicts, financial advisors may feel they have been granted a moral license to pursue their own self-interest, and may feel partially absolved of their professional obligation to be objective.

Elinor Ostrom received a Nobel Prize in 2009 for studying how societies deal with Tragedies of the Commons: grazing rights in the Swiss Alps, fishing rights off the coast of Turkey, irrigation communities in the Philippines. She's studied commons around the world, and has a list of rules for successfully managing them. Generalizing them to our broad spectrum of societal dilemmas, they serve as a primer for effective institutional pressure:

1. Everyone must understand the group interest and know what the group norm is.

2. The group norm must be something that the group actually wants.

3. The group must be able to modify the norm.

4. Any institution charged with enforcing the group norm must be accountable to the group, so it's effectively self-regulated. We'll discuss these institutions in Chapter 14.

5. The penalties for defecting must be commensurate with the seriousness of the defection.

6. The system for assessing penalties must be consistent, fair, efficient, and relatively cheap.

7. The group must be able to develop its own institutional pressure, not have it imposed from outside the group.

8. If there are larger groups and larger group interests, then the groups need to be scaled properly and nested in multiple layers—each operating along these same lines.

Ostrom's rules may very well be the closest model we have to our species' first successful set of institutional pressures. They're not imposed from above; they grow organically from the group. Societies of resource users are able to self-regulate if they follow these rules, and that self-regulation is stable over the long term. It's generally when outsiders come in and institutionalize a resource-management system that things start to fail.

I mentioned institutional pressure as a formalization of reputational pressure. This works in several ways. Laws formalize reputation itself. In Chapter 8, we talked about group membership as a substitute for individual reputation. As societies grow, laws formalize some group memberships.

For example, doctors need a license to practice. So do architects, engineers, private investigators, plumbers, and real estate agents. Restaurants need licenses and regular inspections by health officials to operate. The basic idea is that these official certifications provide a basis for people to trust these doctors, private investigators, and restaurants without knowing anything about their reputations. Certification informs potential clients that a businessperson has at least the minimum formal education and skill required to safely and competently perform the service in question, and that the businessperson is accountable to someone other than the customer: a licensing body, a trade organization, and so on. Handicap license plates are another formalized reputational system. Not all certifications are controlled by the government; some come from private institutions, such as Underwriters Laboratories' certifications, the Good Housekeeping Seal of Approval, Consumer Reports rankings, and a variety of computer standards.

Other formal memberships that serve as reputation substitutes include academic degrees, bar associations for lawyers, the Better Business Bureau, food products' labels of origin—appellation d'origine contrôlée in France, and U.S. counterparts like “Wisconsin cheese” and “Made in Vermont”—USDA Organic certification, consumer credit ratings and reports, bonding, and accreditation of educational institutions.

Negative reputation can also be institutionalized: public sex-offender registries, the DHS terrorist “no fly” list, blacklists for union organizers or suspected Communists, and designations on driver's licenses of a felony conviction. The scarlet letter is an older example, and the yellow star the Nazis required Jews to wear is a particularly despicable one.

Laws also formalize commitment. Legal contracts are probably the best example. Marriage licenses, curfew laws, and laws that enforce parents' commitment to raise their children are others.

Societal Dilemma: Following contracts.
Society: Society as a whole.
Group interest: Effectively formalize agreements.
Competing interest: Maximize some self-interest.
Group norm: Follow contracts.
Corresponding defection: Break contracts.
To encourage people to act in the group interest, the society implements a variety of trust mechanisms.

Moral: We feel good about keeping our word.

Reputational: No one does business with individuals and companies with a reputation for breaking contracts.

Institutional: There are all sorts of laws regarding the legality of contracts, and sanctions for breaking them.

Finally, laws formalize societal norms that reputation traditionally enforced: anti-incest laws and age-of-consent laws, minimum drinking ages, bans on false advertising, blue laws, public indecency/intoxication laws, city lawn and weed ordinances, noise ordinances, libel and slander laws, zoning regulations, laws against impersonating police officers, and—in a perverse way—laws prohibiting people from criticizing the government. Employment applications that ask if you have ever been convicted of a felony are a formalization of reputation.

All of these institutional pressures allow reputation to scale, by giving people a system to trust so they don't necessarily have to trust individuals. If I trust the system of government-issued identification cards and driver's licenses, I don't have to wonder whether to trust each and every person who tells me he's old enough to drink in my bar.

There are many ways institutional pressure fails:

There is too little or too much of it.
We've seen how institutional pressure is required to augment moral and reputational pressures in large and complex societies. Too little institutional pressure and the scope of defection is too great. For example, there's more tax evasion if the crime goes unpunished.

But more institutional pressure isn't always better. Gary Becker won a Nobel Prize in economics in part for his work in criminology. He asked the obvious question: what's the optimal level of crime? The naïve answer is zero, but that is unattainable, and pursuing it requires so much institutional pressure that society falls apart. Too much institutional pressure, and you get a country that looks like North Korea or the former East Germany: police states with a good part of the population working for the police. The other extreme—no police—doesn't work, either. You get lawless countries like Afghanistan and Somalia. Somewhere in the middle lie the optimal scope of defection and the optimal level of enforcement.
