Even worse than losing self-confidence, though, is reacting defensively. There are surgeons who will see faults everywhere except in themselves. They have no questions and no fears about their abilities. As a result, they learn nothing from their mistakes and know nothing of their limitations. As one surgeon told me, it is a rare but alarming thing to meet a surgeon without fear. “If you’re not a little afraid when you operate,” he said, “you’re bound to do a patient a grave disservice.”
The atmosphere at the M & M is meant to discourage both attitudes—self-doubt and denial—for the M & M is a cultural ritual that inculcates in surgeons a “correct” view of mistakes. “What would you do differently?” a chairman asks concerning cases of avoidable harm. “Nothing” is seldom an acceptable answer.
In its way, the M & M is an impressively sophisticated and human institution. Unlike the courts or the media, it recognizes that human error is generally not something that can be deterred by punishment. The M & M sees avoiding error as largely a matter of will—of staying sufficiently informed and alert to anticipate the myriad ways that things can go wrong and then trying to head off each potential problem before it happens. It isn’t damnable that an error occurs, but there is some shame to it. In fact, the M & M’s ethos can seem paradoxical. On the one hand, it reinforces the very American idea that error is intolerable. On the other hand, the very existence of the M & M, its place on the weekly schedule, amounts to an acknowledgment that mistakes are an inevitable part of medicine.
But why do they happen so often? Lucian Leape, medicine’s leading expert on error, points out that many other industries—whether the task is manufacturing semiconductors or serving customers at the Ritz-Carlton—simply wouldn’t countenance error rates like those in hospitals. The aviation industry has reduced the frequency of operational errors to one in a hundred thousand flights, and most of those errors have no harmful consequences. The buzzword at General Electric these days is “Six Sigma,” meaning that its goal is to make product defects so rare that in statistical terms they are more than six standard deviations away from being a matter of chance—almost a one-in-a-million occurrence.
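A hedged numerical aside on that last claim, using the standard normal tail probabilities on which the Six Sigma convention rests (the 1.5-sigma allowance for drift in the process mean, reflected in the second figure, is part of that convention, not anything stated above):

    P(Z > 6) \approx 9.9 \times 10^{-10} \quad \text{(about one defect in a billion opportunities)}
    P(Z > 6 - 1.5) = P(Z > 4.5) \approx 3.4 \times 10^{-6} \quad \text{(the conventional 3.4 defects per million)}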
Of course, patients are far more complicated and idiosyncratic than airplanes, and medicine isn’t a matter of delivering a fixed product or even a catalogue of products; it may well be more complex than just about any other field of human endeavor. Yet everything we’ve learned in the past two decades—from cognitive psychology, from “human factors” engineering, from studies of disasters like Three Mile Island and Bhopal—has yielded the same insights: not only do all human beings err, but they err frequently and in predictable, patterned ways. And systems that do not adjust for these realities can end up exacerbating rather than eliminating error.
The British psychologist James Reason argues, in his book Human Error, that our propensity for certain types of error is the price we pay for the brain’s remarkable ability to think and act intuitively—to sift quickly through the sensory information that constantly bombards us without wasting time trying to work through every situation anew. Thus systems that rely on human perfection present what Reason calls “latent errors”—errors waiting to happen. Medicine teems with examples. Take writing out a prescription, a rote procedure that relies on memory and attention, which we know are unreliable. Inevitably, a physician will sometimes specify the wrong dose or the wrong drug. Even when the prescription is written correctly, there’s a risk that it will be misread. (Computerized ordering systems can almost eliminate errors of this kind, but only a small minority of hospitals have adopted them.) Medical equipment, which manufacturers often build without human operators in mind, is another area rife with latent errors: one reason physicians are bound to have problems when they use cardiac defibrillators is that the devices have no standard design. You can also make the case that onerous workloads, chaotic environments, and inadequate team communication all represent latent errors in the system.
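The point about computerized ordering bears a small illustration. What follows is a minimal sketch, in Python, of the kind of check such a system performs; the drug names and dose ranges are hypothetical placeholders, not a real formulary, and real systems also weigh patient weight, kidney function, allergies, and drug interactions.

    # Minimal sketch of the kind of check a computerized ordering system performs.
    # The formulary below is a hypothetical placeholder, not real clinical data.
    FORMULARY = {
        # drug name: (minimum dose, maximum dose, unit)
        "metoprolol": (12.5, 200.0, "mg"),
        "warfarin": (1.0, 10.0, "mg"),
    }

    def check_order(drug: str, dose: float, unit: str) -> list[str]:
        """Return a list of problems with a proposed order (an empty list means no objections)."""
        problems = []
        entry = FORMULARY.get(drug.lower())
        if entry is None:
            problems.append(f"unrecognized drug name: {drug!r}")
            return problems
        low, high, expected_unit = entry
        if unit != expected_unit:
            problems.append(f"{drug}: expected dose in {expected_unit}, got {unit}")
        elif not (low <= dose <= high):
            problems.append(f"{drug}: dose {dose} {unit} outside usual range {low}-{high} {expected_unit}")
        return problems

    # A misremembered dose or a misread name is flagged before it reaches the pharmacy:
    print(check_order("warfarin", 100.0, "mg"))    # dose far outside the usual range
    print(check_order("metroprolol", 50.0, "mg"))  # misspelled drug name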
James Reason makes another important observation: disasters do not simply occur; they evolve. In complex systems, a single failure rarely leads to harm. Human beings are impressively good at adjusting when an error becomes apparent, and systems often have built-in defenses. For example, pharmacists and nurses routinely check and countercheck physicians’ orders. But errors do not always become apparent, and backup systems themselves often fail as a result of latent errors. A pharmacist forgets to check one of a thousand prescriptions. A machine’s alarm bell malfunctions. The one attending trauma surgeon available gets stuck in the operating room. When things go wrong, it is usually because a series of failures conspires to produce disaster.
The M & M takes none of this into account. For that reason, many experts see it as a rather shabby approach to analyzing error and improving performance in medicine. It isn’t enough to ask what a clinician could or should have done differently so that he and others may learn for next time. The doctor is often only the final actor in a chain of events that set him or her up to fail. Error experts, therefore, believe that it’s the process, not the individuals in it, that requires closer examination and correction. In a sense, they want to industrialize medicine. And they can already claim successes: the Shouldice Hospital’s “focused factory” for hernia operations, for one—and far more broadly, the entire specialty of anesthesiology, which has adopted their precepts and seen extraordinary results.
At the center of the emblem of the American Society of Anesthesiologists is a single word: “Vigilance.” When you put a patient to sleep under general anesthesia, you assume almost complete control of the patient’s body. The body is paralyzed, the brain rendered unconscious, and machines are hooked up to control breathing, heart rate, blood pressure—all the vital functions. Given the complexity of the machinery and of the human body, there are a seemingly infinite number of ways in which things can go wrong, even in minor surgery. And yet anesthesiologists have found that if problems are detected they can usually be solved. In the 1940s, there was only one death resulting from anesthesia in every twenty-five hundred operations, and between the 1960s and the 1980s the rate had stabilized at one or two in every ten thousand operations.
But Ellison (Jeep) Pierce had always regarded even that rate as unconscionable. From the time he began practicing, in 1960, as a young anesthesiologist out of North Carolina and the University of Pennsylvania, he had maintained a case file of details from all the deadly anesthetic accidents he had come across or participated in. But it was one case in particular that galvanized him. Friends of his had taken their eighteen-year-old daughter to the hospital to have her wisdom teeth pulled, under general anesthesia. The anesthesiologist inserted the breathing tube into her esophagus instead of her trachea, which is a relatively common mishap, and then failed to spot the error, which is not. Deprived of oxygen, she died within minutes. Pierce knew that a one-in-ten-thousand death rate, given that anesthesia was administered in the United States an estimated thirty-five million times each year, meant thirty-five hundred avoidable deaths like that one.
In 1982, Pierce was elected vice president of the American Society of Anesthesiologists and got an opportunity to do something about the death rate. The same year, ABC’s 20/20 aired an exposé that caused a considerable stir in his profession. The segment began, “If you are going to go into anesthesia, you are going on a long trip, and you should not do it if you can avoid it in any way. General anesthesia [is] safe most of the time, but there are dangers from human error, carelessness, and a critical shortage of anesthesiologists. This year, six thousand patients will die or suffer brain damage.” The program presented several terrifying cases from around the country. Between the small crisis that the show created and the sharp increases in physicians’ malpractice insurance premiums at that time, Pierce was able to mobilize the Society of Anesthesiologists to focus on the problem of error.
He turned for ideas not to a physician but to an engineer named Jeffrey Cooper, the lead author of a groundbreaking 1978 paper entitled “Preventable Anesthesia Mishaps: A Study of Human Factors.” An unassuming, fastidious man, Cooper had been hired in 1972, when he was twenty-six years old, by the Massachusetts General Hospital bioengineering unit, to work on developing machines for anesthesiology researchers. He gravitated toward the operating room, however, and spent hours there observing the anesthesiologists, and one of the first things he noticed was how poorly the anesthesia machines were designed. For example, a clockwise turn of a dial decreased the concentration of potent anesthetics in about half the machines but increased the concentration in the other half. He decided to borrow a technique called “critical incident analysis”—which had been used since the 1950s to analyze mishaps in aviation—in an effort to learn how equipment might be contributing to errors in anesthesia. The technique is built around carefully conducted interviews, designed to capture as much detail as possible about dangerous incidents: how specific accidents evolved and what factors contributed to them. This information is then used to look for patterns among different cases.
Getting open, honest reporting is crucial. The Federal Aviation Administration has a formalized system for analyzing and reporting dangerous aviation incidents, and its enormous success in improving airline safety rests on two cornerstones. Pilots who report an incident within ten days have automatic immunity from punishment, and the reports go to a neutral, outside agency, NASA, which has no interest in using the information against individual pilots. For Jeffrey Cooper, it was probably an advantage that he was an engineer and not a physician, so that anesthesiologists regarded him as a discreet, unthreatening researcher.
The result was the first in-depth scientific look at errors in medicine. His detailed analysis of three hundred and fifty-nine errors provided a view of the profession unlike anything that had been seen before. Contrary to the prevailing assumption that the start of anesthesia (“takeoff”) was the most dangerous part, anesthesiologists learned that incidents tended to occur in the middle of anesthesia, when vigilance waned. The most common kind of incident involved errors in maintaining the patient’s breathing, and these were usually the result of an undetected disconnection or misconnection of the breathing tubing, mistakes in managing the airway, or mistakes in using the anesthesia machine. Just as important, Cooper enumerated a list of contributory factors, including inadequate experience, inadequate familiarity with equipment, poor communication among team members, haste, inattention, and fatigue.
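To make the pattern-finding step concrete, here is a minimal sketch of how tallies like Cooper’s might be pulled out of a pile of incident reports once the interviews are done. The records and field names are invented for illustration; Cooper’s actual method rested on the structured interviews described above, not on this code.

    from collections import Counter

    # After a critical-incident interview, each report is reduced to a short
    # description and a list of contributory factors. These records are invented.
    incidents = [
        {"description": "breathing circuit disconnection, undetected for several minutes",
         "factors": ["inattention", "inadequate familiarity with equipment"]},
        {"description": "wrong gas-flow dial turned during emergence",
         "factors": ["inadequate familiarity with equipment", "haste"]},
        {"description": "drug syringe swap during a rushed room turnover",
         "factors": ["haste", "poor communication", "fatigue"]},
    ]

    # Tally how often each contributory factor appears across all incidents,
    # which is the kind of pattern an analysis like this looks for.
    factor_counts = Counter(factor for incident in incidents for factor in incident["factors"])
    for factor, count in factor_counts.most_common():
        print(f"{factor}: {count}")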
The study provoked widespread debate among anesthesiologists, but there was no concerted effort to solve the problems until Jeep Pierce came along. Through the anesthesiology society at first, and then through a foundation that he started, Pierce directed funding into research on how to reduce the problems Cooper had identified, sponsored an international conference to gather ideas from around the world, and brought anesthesia machine designers into safety discussions.
It all worked. Hours for anesthesiology residents were shortened. Manufacturers began redesigning their machines with fallible human beings in mind. Dials were standardized to turn in a uniform direction; locks were put in to prevent accidental administration of more than one anesthetic gas; controls were changed so that oxygen delivery could not be turned down to zero.
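That last change amounts to a built-in constraint that the machine enforces no matter what the operator does. A tiny sketch of the idea, with an invented minimum flow value:

    # Sketch of a control that refuses to let oxygen delivery reach zero.
    # The minimum flow figure is invented for illustration.
    MINIMUM_OXYGEN_FLOW_LPM = 0.5  # liters per minute

    def set_oxygen_flow(requested_lpm: float) -> float:
        """Return the flow actually delivered, never below the built-in floor."""
        return max(requested_lpm, MINIMUM_OXYGEN_FLOW_LPM)

    print(set_oxygen_flow(0.0))  # the machine quietly delivers 0.5 rather than 0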
Where errors could not be eliminated directly, anesthesiologists began looking for reliable means of detecting them earlier. For example, because the trachea and the esophagus are so close together, it is almost inevitable that an anesthesiologist will sometimes put the breathing tube down the wrong pipe. Anesthesiologists had always checked for this by listening with a stethoscope for breath sounds over both lungs. But Cooper had turned up a surprising number of mishaps—like the one that befell the daughter of Pierce’s friends—involving undetected esophageal intubations. Something more effective was needed. In fact, monitors that could detect this kind of error had been available for years, but, in part because of their expense, relatively few anesthesiologists used them. One type of monitor could verify that the tube was in the trachea by detecting carbon dioxide being exhaled from the lungs. Another type, the pulse oximeter, tracked blood oxygen levels, thereby providing an early warning that something was wrong with the patient’s breathing system. Prodded by Pierce and others, the anesthesiology society made the use of both types of monitor for every patient receiving general anesthesia an official standard. Today, anesthesia deaths from misconnecting the breathing system or intubating the esophagus rather than the trachea are virtually unknown. In a decade, the overall death rate dropped to just one in more than two hundred thousand cases—less than a twentieth of what it had been.
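A toy sketch of the early-warning logic those two monitors supply, in Python; the threshold numbers below are illustrative placeholders, not clinical standards.

    # Toy early-warning check combining the two monitors described above.
    # Threshold values are illustrative placeholders, not clinical guidance.
    ETCO2_MIN_MMHG = 30.0    # exhaled CO2 this low (or absent) suggests the tube is not in the trachea
    SPO2_MIN_PERCENT = 90.0  # blood oxygen saturation below this warrants attention

    def breathing_alarms(end_tidal_co2_mmhg: float, spo2_percent: float) -> list[str]:
        """Return warning messages when the monitored values fall below their thresholds."""
        warnings = []
        if end_tidal_co2_mmhg < ETCO2_MIN_MMHG:
            warnings.append("low or absent exhaled CO2: check tube placement and circuit connections")
        if spo2_percent < SPO2_MIN_PERCENT:
            warnings.append("low blood oxygen saturation: check the breathing system")
        return warnings

    # An esophageal intubation shows up almost immediately as absent exhaled CO2,
    # even before oxygen saturation begins to fall:
    print(breathing_alarms(end_tidal_co2_mmhg=2.0, spo2_percent=97.0))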
And the reformers have not stopped there. David Gaba, a professor of anesthesiology at Stanford, has focused on improving human performance. In aviation, he points out, pilot experience is recognized to be invaluable but insufficient: pilots seldom have direct experience with serious plane malfunctions anymore. They are therefore required to undergo yearly training in crisis simulators. Why not doctors, too?
Gaba, a physician with training in engineering, led in the design of an anesthesia-simulation system known as the Eagle Patient Simulator. It is a life-size, computer-driven mannequin that is capable of amazingly realistic behavior. It has a circulation, a heartbeat, and lungs that take in oxygen and expire carbon dioxide. If you inject drugs into it or administer inhaled anesthetics, it will detect the type and amount, and its heart rate, its blood pressure, and its oxygen levels will respond appropriately. The “patient” can be made to develop airway swelling, bleeding, and heart disturbances. The mannequin is laid on an operating table in a simulation room equipped exactly like the real thing. Here both residents and experienced attending physicians learn to perform effectively in all kinds of dangerous, and sometimes freak, scenarios: an anesthesia machine malfunction, a power outage, a patient who goes into cardiac arrest during surgery, and even a cesarean-section patient whose airway shuts down and who requires an emergency tracheostomy.
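Underneath the realism, the mannequin’s behavior comes down to a physiological model stepped forward in time and nudged by whatever drugs are given. The following is a drastically simplified sketch of that idea; the state variables, drug effects, and numbers are invented, and bear no relation to the actual Eagle software.

    from dataclasses import dataclass

    @dataclass
    class VitalSigns:
        heart_rate: float = 75.0         # beats per minute
        blood_pressure: float = 120.0    # systolic, mmHg
        oxygen_saturation: float = 98.0  # percent

    # Invented drug effects: each entry nudges a vital sign per unit of drug given.
    DRUG_EFFECTS = {
        "propofol": {"heart_rate": -0.05, "blood_pressure": -0.15},
        "epinephrine": {"heart_rate": 0.3, "blood_pressure": 0.4},
    }

    def step(state: VitalSigns, doses: dict[str, float]) -> VitalSigns:
        """Advance the simulated patient one time step, applying the drugs given."""
        new = VitalSigns(state.heart_rate, state.blood_pressure, state.oxygen_saturation)
        for drug, amount in doses.items():
            for attribute, effect_per_unit in DRUG_EFFECTS.get(drug, {}).items():
                setattr(new, attribute, getattr(new, attribute) + effect_per_unit * amount)
        # Drift slowly back toward baseline between doses.
        new.heart_rate += (75.0 - new.heart_rate) * 0.02
        new.blood_pressure += (120.0 - new.blood_pressure) * 0.02
        return new

    patient = VitalSigns()
    patient = step(patient, {"propofol": 200.0})  # an induction dose lowers heart rate and pressure
    print(patient)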