When they stopped for gas a few hours later, Daniel crept out the window and climbed over the roof of the car. Horrified, his mother commanded, “Stop that!” Daniel replied, “You stop it!” and crawled down the windshield. With Daniel back in the car, the family drove on. Daniel found the cell phone again; this time he threw it on the
floor and broke it. As this little Napoleon grew older, Daniel saw just how easily he could ignore his family’s social boundaries, then any social boundaries. He got used to demanding his way everywhere. He started hitting kids at school who did not pay attention to him. He developed a sulfurous relationship with authority. He stole things from classmates. Eventually, his moral clutch slipped completely, and he stabbed a little girl in the cheek with a pencil. He was expelled from school. As of this writing, the family is embroiled in a lawsuit, as is the school.
Daniel was a behavioral—one is tempted to say moral—wreck. Though it is easy to be a back-seat parent, each year there seems to be a bumper crop of out-of-control kids and helpless parents. No loving parent wants to raise a Daniel. In this chapter, we’ll talk about how to avoid doing so. You can create moral maturity in most children. And, perhaps surprisingly, there is neuroscience behind it.
Are babies born moral?
What exactly does “moral” mean? Are any moral absolutes embedded into our brains, or is moral awareness only culturally understood? These questions have occupied philosophers for centuries. The word “moral,” in both its Greek and Latin incarnations, has a strong social underpinning. It originally outlined a code of conduct, a consensus of manners and customs possessing equal parts “heartily recommend it” and “don’t you dare.” That’s the definition we’ll use: a set of value-laden behaviors embraced by a cultural group whose main function is to guide social behavior.
Why would we need such rules in the first place? It may have to do with that strong evolutionary requirement for social cooperation. Some researchers believe our moral sense—really a specific suite of socializing behaviors—developed to aid that cooperation. Regular massacres, after all, are not exactly in the best interests of a species whose effective founding population was less than 18,500 individuals
(some say less than 2,000). In this Darwinian view, our brains come preloaded at birth with certain limited moral sensibilities, which then develop in a semi-variable fashion, depending upon how we are raised. “We are born with a universal moral grammar,” says cognitive scientist Steven Pinker, “that forces us to analyze human action in terms of its moral structure.”
Popular candidates for which moral sensibilities we possess include distinctions between right and wrong; proscriptions against social violence such as rape and murder; and empathy. Yale psychologist Paul Bloom lists a sense of justice, emotional responses to thoughtfulness and altruism, and a willingness to judge another person’s behavior. Psychologist Jon Haidt sees five categories: harm, fairness, loyalty, respect for authority, and something intriguingly called spiritual purity.
If such moral sensibilities are an innate part of our brain’s function, we might be able to see shards of them in some of our evolutionary neighbors. And we can, looking no further than a zoo in England. Kuni, a female chimpanzee, lived in a zoo enclosure that was part glass and part open air, mostly surrounded by a moat. One day a starling hit the glass wall and fell into the cage, and the chimp captured it. Though apparently stunned, the bird was not physically hurt, and the keeper urged the chimp to let it go.
What Kuni did next was extraordinary. She picked up the limp bird, put it on its feet, and tossed it a short distance. The bird didn’t revive. Kuni seemed to think about this, then devised a strategy. She picked up the starling with one hand and climbed to the top of the highest tree in the enclosure with the other hand, looking like King Kong with an avian Fay Wray. The ape wrapped her legs around the trunk of the tree, freeing both hands to work with the starling. With great dexterity, she grabbed both wings of the bird—one in each hand—and carefully unfolded them. Spreading the wings wide, she next threw the bird as hard as she could in the general direction of freedom. The bird missed the moat and landed just inside the bank,
where a curious juvenile ape came over to investigate. Kuni quickly scrambled down and stood guard over the bird for a long time. She stayed at her post until the bird could fly away on its own.
This is an extraordinary example of … something. Though we cannot get into the mind of a chimp, it is one of a host of observations that suggest animals have an active emotional life, including, perhaps, notional altruism. Humans tend to have that altruistic quality in spades, and in much more sophisticated forms than our genetic neighbors.
If moral awareness is universal, we might also expect to see general agreement across cultures. Harvard researchers developed a Moral Sense Test, which hundreds of thousands of people from more than 120 countries have taken. (You can take it, too, at http://moral.wjh.harvard.edu.) The data they’ve compiled appear to confirm a universal moral sense.
A third hint that moral awareness is innate, which we’ll get to in a few pages, has to do with the fact that damage to a specific part of the brain can affect the ability to make certain types of moral decisions.
Why don’t kids just do the right thing?
If children are born with an innate sense of right and wrong, why don’t they just do the right thing—especially as they get older (puberty comes to mind)?
Turns out it’s surprisingly difficult to explain proactive moral behavior, like voluntarily helping someone cross the street. Even enlightened self-interest does not fully explain certain types of human altruism. The road between moral reasoning and moral behavior is quite rocky. The concept of “conscience” was developed in a partial attempt to pave over this difficulty. A conscience is something that makes you feel good when you do good things and makes you feel bad when you don’t. The late Harvard psychologist Lawrence Kohlberg believed that a healthy conscience was the top rung in the ladder of
all moral reasoning. But not all scientists think conscience is innate. Some think it’s a social construct. For them, internalization is the most important measure of moral awareness.
A child who can resist the temptation to defy some moral norm, even when the possibility of detection and punishment is zero, has internalized the rule. They not only know what is proper (an awareness that might have been preloaded into their brains), but they now agree with it and attempt to align their behaviors accordingly. This is also sometimes called inhibitory control, which sounds suspiciously like well-developed executive function. They may be the same thing.
Either way, a willingness to make the right choices—and to withstand pressure to make the wrong ones, even in the absence of a credible threat or the presence of a reward—is the goal of moral development. Which means your parenting objective is to get your child to pay attention to and align himself with his innate sense of right and wrong.
This takes time. A lot of time.
One lie every two hours
One reason we know this is the way kids lie, which changes with age. I once heard a psychology professor discuss what happens when a child first becomes capable of lying, and he livened up his talk with an old Bill Cosby routine. With apologies to both the professor and Cosby, here is my recollection of his story.
Bill and his brother Russell were jumping on the bed in the middle of the night—in violation of their parents’ strictest orders. They broke the bed frame, and the snap and crash awakened a furious father. Dad stormed into the room, pointed to the broken furniture, and bellowed, “Did you do this?” The older boy stammered, “No, Dad! I didn’t do it!” Then the boy paused, a light jumping into his eyes. “But I know who did. A teenager came into our room from the bedroom window. He jumped up and down on the bed 10 times and broke it, then he leaped out the window and ran down the street!” The dad’s brow wrinkled. “Son, there is no window in this room.” The boy didn’t miss a beat. “I know, Dad! He took it with him!”
Yes, children are bad at lying, at least at first. In the magical fairy dust of the childhood mind, kids initially have a hard time distinguishing reality from fancy, which you can see in their eagerness to engage in imaginative play. They also perceive their parents to be essentially omniscient, a belief that won’t be completely destroyed until the 20-kiloton blast of puberty. The fuse gets lit early, though, around 36 months, when kids begin to realize that parents can’t always read their minds. To their delight (or horror), children discover they can give their parents false information without its being detected. Or, as the Cosby story relates, they think they can. The child’s realization that you can’t always read his or her mind coincides with the flowering of something we call Theory of Mind skills.
Theory of Mind develops over time
What is Theory of Mind? A literary example may help explain it. Ernest Hemingway was once challenged to write an entire novel in only six words, and what he wrote is a perfect illustration of Theory of Mind. That’s because when you read it, it will activate yours.
For sale: Baby shoes. Never used.
Do these six words make you sad? Make you wonder what happened to the person who wrote the advertisement? Can you infer that person’s mental state?
Most humans can, and we use Theory of Mind skills to do it. The basis of these skills is the understanding that another’s behavior is motivated by a range of mental states—beliefs, intentions, desires, perceptions, emotions. Theory of Mind, a term coined by noted primatologist David Premack, has two general components. The first is the
ability to discern someone else’s psychological state. The second is the realization that although these states may be different from your own, they are still valid for the person with whom you are interacting. You develop a theory of how the other person’s mind works, even if it differs from your own.
Those six words could have been written by a couple whose baby died shortly after birth, and you feel the pangs of their sadness. You may never have experienced the grief of having lost a child; you may not even have children. Nonetheless, using your advanced Theory of Mind skills, you can experience their reality and empathize. The shortest novel in the world can reveal a universe of feeling because of it. Hemingway considered it his best work.
Even though Theory of Mind is a hallmark of human behavior, we don’t think it is fully developed at birth. It is extremely difficult to measure in very young children. The skill appears instead to unfold progressively, influenced by social experiences. You can see this timeline in the way kids lie. Pulling the wool over someone’s eyes requires Theory of Mind—the ability to peer into someone else’s mind and predict what they will think if you tell them one thing while knowing another to be true. The talent improves over time.
After age 3, kids begin to lie in earnest, though they usually do so imperfectly. They pick up the nasty habit with astonishing speed. By age 4, a child will tell a lie about once every two hours; by age 6, she will do it once every 90 minutes. As a child grows in vocabulary and social experience, the lies become more sophisticated, more prevalent, and harder to spot.
This timeline suggested to researchers that children have an age-dependent relationship with certain types of moral reasoning, too. Kids might be born with certain moral instincts, but it takes a while to coax them into their mature form.
How moral reasoning develops
Kohlberg, the Harvard psychologist, believed that moral reasoning depended upon general cognitive maturity—another way of saying that these things take time. If indeed decisions have strong emotional roots, as we will explore, I would also argue that moral reasoning depends upon emotional maturity. Though Kohlberg has his critics, his ideas remain influential, as do those of his intellectual mentor, Jean Piaget. The ideas of both men have been applied in schools, juvenile detention facilities, even prisons. Kohlberg outlined a progressive process for moral development:
1. Avoiding punishment. Moral reasoning starts out at a fairly primitive level, focused mostly on avoiding punishment. Kohlberg calls this stage pre-conventional moral reasoning.
2. Considering consequences. As a child’s mind develops, she begins to consider the social consequences of her behaviors and starts to modify them accordingly. Kohlberg terms this conventional moral reasoning.
3. Acting on principle. Eventually, the child begins to base her behavioral choices on well-thought-out, objective moral principles, not just on avoidance of punishment or peer acceptance. Kohlberg calls this coveted stage post-conventional moral reasoning. One could argue that the goal of any parent is to land here.
Kids don’t necessarily arrive at this third stage all by themselves. Along with time and experience, it can take a wise parent to get a child to consistently behave in a manner congruent with his or her inborn moral grammar. Part of the reason it’s tough is that when children observe bad behavior, they have
learned
it. Even if the bad behavior is punished, it remains easily accessible in the child’s
brain. Psychologist Albert Bandura was able to show this, with help from a clown.
Lessons from Bobo the clown
In the 1960s, Bandura showed preschoolers a film involving a Bobo doll, one of those inflatable plastic clowns weighted on the bottom. In the film, an adult named Susan kicks and punches the doll, then repeatedly clobbers it with a hammer—buckets o’ violence. After the film, the preschoolers are taken into another room filled with toys, including (surprise) a Bobo doll and a toy hammer. What do the children do? It depends.