The Belief Instinct: The Psychology of Souls, Destiny, and the Meaning of Life
Author: Jesse Bering
In her unpublished doctoral dissertation, one of my PhD students, Bethany Heywood, found similar evidence of atheists’ covert believing tendencies when they were asked to think about the major turning points in their own lives. Throughout 2009, Heywood conducted a series of online interviews using Instant Messenger with thirty-four atheists and thirty-four believers, including American and British samples. To prevent them from answering dishonestly (because many atheists bristle at even the insinuation that they’re irrationally minded and superstitious), the participants were given the impression that the study was about their memory of personal life events, otherwise known as “autobiographical memory.” In fact, Heywood wasn’t so much interested in the respondents’ memory skills as in their subjective interpretations of why these particular things had happened to them. In her careful analysis of their responses to her questions—such as, “Did you learn any lessons from this experience?” “How has this event changed your life?” “Looking back, are you better able to understand why this event happened to you than you were at the time?”—Heywood found that about two-thirds of the atheists had made at least one response betraying their implicit view that “everything happens for a reason.” As expected, more believers gave such responses—often noting the suspicion that their most trying times were actually God’s creative handiwork. But the overall difference in ascribing some inherent reason or purpose to momentous, life-altering events was, curiously, statistically negligible between the two groups.
A typical example of the atheist’s reasoning in this manner is shown in a response given by a British undergraduate student who considered herself to be an unflinching nonbeliever, the type that might keep surplus “Darwin fish” bumper stickers in her knapsack. One of the major turning points in her life, this student said, was failing an important university course and losing her prestigious scholarship. It changed everything. But when she was asked why this had happened to her—an ambiguous question that could as easily have pointed to her poor study habits, challenges at home, or ineptitude as to anything else—she answered, “So that I could see that even if I failed a course, my life wouldn’t actually end.”
Other atheists confessed that they often caught themselves thinking in such a fashion too, but then immediately corrected this subjective, psychological bias in line with their explicitly logical beliefs. One was a middle-aged man who had recently botched a job interview for a position that he very much wanted, failing to get an offer from his prospective employer: “And I found myself thinking: maybe this is meant to happen so I can find a better job or move to a different country to work—something like that. But in reality I don’t believe in fate, so it’s strange to find oneself thinking like that.”
Bethany Heywood’s group of religious high-functioning autistics that we met in Chapter 3, by contrast, failed to even comprehend the teleo-functional pretext underlying these types of questions. For example, here is an exchange between Heywood (BH) and a man (JD) with Asperger’s syndrome, a type of high-functioning autism:
BH: “Do you ever think you see meaning in events that are seemingly coincidental?”
JD: “Yes, sometimes. I’m sorry I’m not sure I understand the question fully.”
BH: “I’m just wondering if you ever think there’s more to coincidences than it seems like?”
JD: “Yes, to a certain degree, like someone says something to me and then later on someone else says something that is virtually the same, kind of like déjà-vu.”
And another Asperger’s respondent (TM):
BH: “Do you ever think there’s more to coincidences than it seems like? Or do you just think coincidences are coincidences?”
TM: “If they’re coincidences, by definition they are unrelated. But sometimes people mistake things for coincidence that actually are a pattern. Like people taking an unexpected dislike to me.”
BH: “Do you ever see a pattern in coincidences?”
TM: “No. By definition coincidences have no pattern.”
Some of the more introspective, and presumably non-autistic, novelists in modern literature have also puzzled over this bizarre cognitive push to tease apart the imaginary strands of such a shadowy web of purpose. In Milan Kundera’s first novel, The Joke (1967), the atheistic protagonist finds himself, duly surprised, ensnared in the fiber of this very web:
For all my skepticism, some trace of irrational superstition did survive in me, the strange conviction, for example, that everything in life that happens to me also has a sense, that it means something, that life speaks to us about itself through its story, and that it gradually reveals a secret, which takes the form of a rebus whose message must be deciphered, that the stories we live comprise the mythology of our lives and in that mythology lies the key to truth and mystery. Is it an illusion? Possibly, even probably, but I can’t rid myself of the need continually to decipher my own life.
This pattern of thinking strongly implies that atheism is more a verbal muzzling of God—a conscious, executively made decision to reject one’s own intuitions about a faceless übermind involved in our personal affairs—than it is a true cognitive exorcism. The thought might be smothered so quickly that we don’t even realize it has happened, but the despondent atheist’s appeal to some reasonable, just mind seems a psychological reflex to tragedy nonetheless.
This doesn’t make us weak, ridiculous, or even foolish. It just makes us human. And, as we’re about to see, it may make us particularly well-adapted human beings—at least, in the evolutionary sense of the term.
6
GOD AS ADAPTIVE ILLUSION
For millions of years before the evolution of theory of mind, our human ancestors were just like other social primates—namely, impulsive, hedonistic, and uninhibited. This isn’t a character judgment against them; it’s just what worked for them in maximizing their reproductive success, just as it does for most modern-day social species. It would also be rather hypocritical of us to hold such things against them. As Freud pointed out long ago with his concept of the primitive “id” component of the human personality, such limbically driven, paleomammalian tendencies are still cozily ensconced in our own human brains.
In fact, before exploring how the evolution of theory of mind shook up our social behaviors, let’s have a look at what would have been the typical social behavioral profile for our ancestors before the evolution of a theory of mind. Presumably, it would have resembled chimpanzees’ behaviors today—not exactly, of course, because chimps have also evolved since the time of the parent species. But most experts still believe that chimpanzees are a “conservative” species, meaning that they probably haven’t changed all that much since the time we last shared a common ancestor with them about six million years ago.
If you’ve ever been to the ape exhibit at your local zoo, two things have probably struck you. First, chimpanzees are eerily similar to us, in both their behaviors and their appearance. And second, they have no shame. After all, complex social emotions such as shame and pride hinge on the presence of a theory of mind, because they involve taking the perspective of others in judging the self as having desirable or undesirable attributes. So it’s not that chimps and other primates “don’t care” what others think of them; it’s that, without a theory of mind (or at least one as all-consuming as ours), they lack the capacity to care. Perhaps it’s not too surprising, then, what you see at the resident primate house. All in plain view of each other, not to mention in plain view of your slack-jawed children, chimps will comfortably pass gas after copulating; cavalierly impose themselves onto screaming, hysterical partners; nonchalantly defecate into cupped hands; casually probe each other’s orifices with all manner of objects, organs, and appendages; and unhesitatingly avail themselves of their own manual pleasures. They will rob their elderly of coveted treats, happily ignore the plaintive cries of their sickly group members, and, when the situation calls for it, aggress against one another with a ravenous, loud, and unbridled rage.
With typical human jurisprudence, we recoil at the mere thought of these things, cover our children’s eyes, and laughingly dismiss these animals as “monkeys”—erroneously, I should add, because they are in fact great apes. What more should we ask of monkeys, we say, than for them to act as fools and clowns, as caricatures of humans, a fumbled half step toward the pinnacle of Creation? But of course that’s nonsense of papal proportions. Without a theory of mind, there’s simply not much reason to refrain from much of anything, except the physical presence of a dominant animal that would strike you if you did it wrong, or perhaps one that, through past exchanges, you’ve learned will alarm others at your provocation, thereby summoning an imposing figure to harm you on its behalf. Chimpanzees certainly have unspoken social norms, many of them quite complex. But without a theory of mind, one thing chimpanzees don’t have is the often crippling, inhibiting psychological sense of others watching, observing, and critically evaluating them.
Humans, unfortunately, are not so lucky. Owing to our evolved theory of mind, other people’s thoughts about us weigh especially heavily on our minds. You might claim that you don’t care what others think of you, and perhaps that’s truer for you than it is for many, but most people suffer immensely when perceived negative aspects of their identity—moral offenses, questionable intentions, embarrassing foibles, physical defects—are made known to others or are on the verge of exposure. In fact, we may well be the only species for which negative social-evaluative appraisals can lead to shame-induced suicide. You’d also be hard-pressed to find another animal that diets, wears toupees, injects botulinum toxin into its face, gets calf implants and boob jobs, or brandishes Gucci handbags, bleached teeth, and pierced navels, because all such vanity acts are meant to influence others’ perceptions of us. And, again, all require a theory of mind.
In no other work has the sheer potency of other minds been captured more vividly than in Sartre’s famous “Hell is other people” play, No Exit (1946). At the opening of the play, we are introduced to the three main characters, who find themselves each in the unenviable position of having just been cast into hell. There is Garcin, an assassinated left-wing journalist and draft dodger who believes he’s in hell because he mistreated his wife; Inez, a sadistic postal worker with a penchant for seducing other women; and Estelle, a pretty, pampered debutante who killed her baby and drove the penniless father to suicide. Strangers to one another, these three people find themselves locked together in an average drawing room with Second Empire furniture. By all appearances, they are each intelligent, sane, and able to think rationally about the situation. For some time after their deaths, they can even continue to observe their friends and loved ones on earth. So how is this hell?
Sartre proceeds to paint a scene so disturbing that it would make even the most rapacious sinner repent, if only to escape the unbearable fate of an eternity spent with others. Sartre’s allegory forces us to examine the subtle ways by which other people, by their sheer mindful presence, can affect us so strongly. For example, there are no mirrors or windows in the drawing room, sleep is not permitted, and the light is always on. The characters’ eyelids are paralyzed, disallowing them even the luxury of blinking. Garcin reacts with muted horror to the prospect of being constantly observed by Inez and Estelle, despite the professed goodwill of both. Tensions begin to mount, especially between Garcin and Inez:
To avoid unwittingly serving as one another’s torturers in hell, Garcin suggests that each person stare at the carpet and try to forget that the others are there. But Inez quips,
How utterly absurd! I feel you there, in every pore. Your silence clamors in my ears. You can nail up your mouth, cut your tongue out—but you can’t prevent your being there. Can you stop your thoughts? I hear them ticking away like a clock, tick-tock, tick-tock, and I’m certain you hear mine. [You’re] everywhere, and every sound comes to me soiled, because you’ve intercepted it on its way.
Later in the play, when Garcin goes to strangle Inez after she mercilessly teases him about his military desertion, she uses his awareness of her own judgment to drive the stake in ever further, instructing him to feel her eyes “boring into” him:
You’re a coward, Garcin, because I wish it, I wish it—do you hear?—I wish it. And yet, just look at me, see how weak I am, a mere breath on the air, a gaze observing you, a formless thought that thinks you. Ah, they’re open now, those big hands, those coarse, man’s hands! But what do you hope to do? You can’t throttle thoughts with hands.
As Sartre so keenly observes in this play, when we really feel watched, our emotions and behaviors are strongly influenced. Nature has made us exquisitely—and sometimes painfully—aware of others’ eyes on us. Although these other minds may well be “mere breath on the air,” they still are a potentially deadly miasma.
Our acute sensitivity to being in the judgmental presence of others shouldn’t come as any surprise. Sartre’s theatrical drawing room simulating hell aside, it’s largely the reason why so many people stand primping for an hour or more before the mirror every morning, why we’re more vigilant about bagging our dogs’ bowel movements when they’ve done their business in front of an audience, why public speaking is many people’s worst nightmare, why stores invest so much money in cameras and security guards, and why we don’t publicly excavate our belly buttons for lint.
In fact, in a wide range of recent laboratory experiments, participants have been found to act more prosocially (and less antisocially) when cues in the environment suggest they’re being observed—even when these cues are just crude stimuli such as eyespots on a computer screen, or a toy placed on a bookshelf with big, prominent, humanlike eyes. We even leave better tips when there’s a picture of a pair of eyes posted above the tip jar.
Does this mean we’re well behaved only because we’re concerned about our reputations? Not entirely—not in the sense of being consciously aware of this motivation anyway. But because of our peculiar evolution, one that involved theory of mind, human minds are a buzzing hive of antagonistic forces and compulsions. We harbor a beast within. It’s not the devil, of course, but nature itself. Owing to the scaffolding of our psychological evolution, in which being impulsive, hedonistic, and uninhibited tended to pay off for our ancestors for millions of years, we’ve each got a bit of a rotten core, that id-like carnal base that makes us lust after those we shouldn’t, fantasize about things we oughtn’t, and dream about doing things we mustn’t. If, for example, a particularly invasive government, in attempting to better understand human nature from a scientific perspective, issued a decree for all men to wear a penile plethysmograph (a device measuring blood flow to the penis, which some people therefore believe serves as an objective measure of sexual arousal) throughout the day, we’d discover a lot of conscientious objectors. Ostensibly, men would refuse to go along with the mandate on principle: how dare the authorities even suggest such a thing, they’d say; this is a case of a government going too far. But the refusal would also be for fear of what such a study would reveal about our hidden natures, because sexual arousal is one of the few things not easily tamed by moral willpower.
If you recall from your basic biology tutorials, there’s one thing that Darwinian evolution can’t do, which is to start over from scratch using a blank genetic canvas. As a general rule, we humans don’t like to conceive of ourselves as animals. The trouble, however, is that that is precisely what we are. Nature has left many remembrances of our primal past embedded in our inherited genotypes. These include a truncated, undeveloped tail at the base of our spines and, in women, two milk-filled udders that swell up over pregnancy with teats perfect for suckling. Although they’ve become blunted over the past few million years, perhaps nothing reminds us of our animal natures more than the intimate touch of our own clumsy tongues palpating against the enameled tips of our canines, those carnivorous tools designed so clearly for defleshing the bones of our prey, and occasionally for organic weaponry.
The human brain, like any other evolved physical trait or organ, is similarly built upon precursory parts from our deep mammalian past. This is because natural selection works by taking a common structure and modifying it over time. It can build on features that are already there (such as what happened with the primate visual cortex as our eyesight became more and more important for life in the trees), shrink them for the sake of neural ergonomics (such as the concomitant decrease in the primate olfactory system as our sense of smell became less and less important), or tinker with articulating structures so that they work together more efficiently. As a result, and because there’s no foresight in evolution, human beings are in many ways suboptimally designed: nature could only work with the building blocks that came before. To accommodate our impressive brainpower, for example, our skulls became so large, so quickly, that compared with those of other mammalian species, women’s pelvic regions are generally too narrow for giving birth. So maternal and infant mortality rates in humans are unprecedented among the rest of the primates. Yet, in terms of net genetic success, the adaptive payoffs of our heightened cognitive abilities as allowed by these large, brain-swaddling craniums apparently outweighed such increased risks during childbirth.
We often find just this sort of give-and-take in evolution. Nature endows a species with an improved design, but not without some cost. In the case we’re most interested in here—theory of mind—our species evolved the comparatively novel capacity to reason about the unobservable mental states of others. Theory of mind had enormous survival value because it allowed our ancestors to be empathic and intensely cooperative, not to mention more Machiavellian and strategic by deliberately deceiving competitors. But the more ancient, “pre–theory of mind” brain never went away, nor did the impulsive, hedonistic, and uninhibited drives that came along with it. So, like the old primate pelvic bones forced to accommodate our new super-sized human heads, there is a dangerous friction between these “old” (pre–theory of mind) and “new” (post–theory of mind) elements of our social brains, a psychological friction that continues to jeopardize our reproduction and survival to this day.
On the one hand, our “old” brains urge us to let it all hang out, as they say—genitals, passions, warts, and all. Again, apart from the threat of looming physical violence, there’s no reason to hide one’s peccadilloes if one does not have pride or shame. Without a theory of mind, I might as well do things in front of you that I’d do in front of any inanimate object; if I can’t conceptualize your mind, you’re basically just a piece of furniture, and I can hardly worry about what you’re thinking about me. Frankly, says the old part of my brain, why do I care about whether the kitchen sink or the chair in the corner can see what I’m up to, much less you? But, on the other hand, our “new” brains are constantly shouting out to us, “Whoa! Not so fast! Think about the implications!” With a theory of mind, then, I realize that once you see me do X, Y, or Z (use your imagination—the more sordid the better), you know this about me. So, yeah, you better believe I’m going to think twice before satisfying my own immediate self-interests.