Happy Accidents: Serendipity in Major Medical Breakthroughs in the Twentieth Century

Morton A. Meyers

Hallucinogens were being used in the treatment of depression and schizophrenia at the University of Zurich. Members of the Native American Church of North America had volunteered to form the core of a study of the influence of mescaline on alcoholism. Psilocybin, a hallucinogen isolated from the Mexican “sacred mushroom” by Albert Hofmann in 1958 and subsequently synthesized, was undergoing FDA-approved trials at the University of Arizona to see if it could allay the symptoms of obsessive-compulsive disorder.[15] As of 2006 LSD was being reevaluated for use in the reduction of pain and fear in dying patients.[16]

A new era of biological psychiatry characterized by advances in the understanding of brain chemistry had dawned, bringing with it profound cultural changes. But the drugs are far from perfect. The claims made for psychotropic medications are sometimes overblown, with too little awareness of their limitations. For instance, about 30 percent of people who try antidepressants are not helped by them. Side effects include an increased risk of suicidal thoughts and behavior in children and adolescents. As research marches on, investigators are working to devise different treatments for the various subtypes of depression.

There are three main fronts in the current research on mental illness: the possibility of other neurotransmitters yet to be found (besides the four known ones: serotonin, dopamine, norepinephrine, and GABA); investigations into physical changes that are now known to occur in the brain (as revealed by modern brain-imaging techniques); and research into possible genetic causes or predispositions.

In an astonishingly rapid series of chance occurrences in the 1940s and 1950s, mood stabilizers, behavior-conditioning drugs, tranquilizers, antidepressants, and antianxiety drugs all became part of the human experience. There is no question that serendipity, not deductive reasoning, was the force behind virtually all these exciting discoveries in the psychotropics.

There is little doubt that the advances of the future, like those of the past, will come about under the good graces of both luck and sagacity.

Conclusion

Taking a Chance on Chance: Cultivating Serendipity

In his farewell address on January 17, 1961, President Dwight Eisenhower famously cautioned the nation about the influence of the “military-industrial complex,” coining a phrase that became part of the political vernacular. However, in the same speech, he presciently warned that scientific and academic research might become too dependent on, and thus shaped by, government grants. He foresaw a situation in which “a government contract becomes virtually a substitute for intellectual curiosity.”

As we have seen, many of the most essential medical discoveries in history came about not because someone came up with a hypothesis, tested it, and discovered that it was correct, but more typically because someone stumbled upon an answer and, after some creative thought, figured out what problem had been inadvertently solved. Such investigations were driven by curiosity, creativity, and often disregard for conventional wisdom. While one might assume that as technology continues to advance, so will major medical discoveries, the truth is that over the past two decades there has been a marked decline in major discoveries. The current system of medical research simply does not foster such serendipitous discoveries. Why is this so, and what can we do about it?

In earlier times, discoverers were more apt to be motivated in their pursuits by personal curiosity. In the nineteenth century, the typical discoverer was not a professional scientist but a well-educated gentleman of independent means (like Charles Darwin) tinkering around on his estate, designing experiments and conceptualizing patterns. There were no committees or granting agencies that had to be romanced and persuaded of the worth of a particular project. Such an individual, driven by nothing other than intellectual curiosity, was open to unexpected observations and would pursue them wherever they led, with no restrictions.

Even in the early twentieth century, the climate was more conducive to serendipitous discovery. In the United States, for example, scientific research was funded by private foundations, notably the Rockefeller Institute for Medical Research in New York (established 1901) and the Rockefeller Foundation (1913). The Rockefeller Institute modeled itself on prestigious European organizations such as the Pasteur Institute in France and the Koch Institute in Germany, recruiting the world's best scientists and providing them with comfortable stipends, well-equipped laboratories, and freedom from teaching obligations and university politics, so that they could devote their energies to research. The Rockefeller Foundation, the most expansive supporter of basic research, especially in biology, between the two world wars, sought out promising scientists and worked to identify and accelerate burgeoning fields of interest. In Britain, too, the Medical Research Council believed in “picking the man, not the project,” and in nurturing successful results with progressive grants.

After World War II, everything about scientific research changed. The U.S. government—which previously had had little to do with funding research except for some agricultural projects—took on a major role. The National Institutes of Health (NIH) grew out of feeble beginnings in 1930 but became foremost among the granting agencies in the early 1940s, around the time it moved to Bethesda, Maryland. The government then established the National Science Foundation (NSF) in 1950 to promote progress in science and engineering.[1]
Research in the United States became centralized and therefore suffused with bureaucracy. The lone scientist working independently was now a rarity. Research came to be characterized by large teams drawing upon multiple scientific disciplines and using highly technical methods in an environment that promoted the not-very-creative phenomenon known as “groupthink.” Under this new regime, the competition among researchers for grant approvals fostered a kind of conformity with existing dogma. As the bureaucracy of granting agencies expanded, planning and justification became the order of the day, stifling the climate in which imaginative thought and creative ideas flourish.

About 90 percent of NIH-funded research is carried out not by the NIH itself on its Bethesda campus but by other (mostly academic medical) organizations throughout the country. The NIH receives more than 43,000 applications a year. Through several stages of review, panels of experts in different fields evaluate the applications and choose which deserve to be funded. About 22 percent are approved, for a period of three to five years. The typical grant recipient works at a university but does not draw a salary from it, depending instead on NIH funding for his or her livelihood. After the three- to five-year grant expires, the researcher must apply to renew the funding. The pressure is typically even greater the second time around: the university has gotten used to “absorbing” up to 50 percent of the grant money for “overhead,” and by now the scientist has a staff of paid postdocs and graduate students who depend on the funding, not to mention that the continuation of the scientist's own faculty position is at stake.

Inherent in the system is a mindset of conformity: one will tend to submit only proposals that are likely to be approved, which is to say, those that conform to the beliefs of most members on the committee of experts. Because of the intense competition for limited money, investigators are reluctant to submit novel or maverick proposals. Needless to say, this environment stifles the spirit of innovation. Taking risks, pioneering new paths, flouting conventional wisdom—the very things one associates with the wild-eyed, wild-haired scientists of the past—don't much enter into the picture nowadays.

These realities of how science is practiced lead to a systemic problem of scientists working essentially with blinders on. Research is “targeted” toward a specifically defined goal with a carefully laid-out plan of procedures and experiments. There is no real room for significant deviation. A researcher can make a supplementary request if an unanticipated finding occurs, as long as its pursuit falls within the general theme—the presumptive goal—that was originally funded. In 2005 the NIH received a meager 170 such applications for supplementary grants, only 51 of which were funded. But if, for instance, someone is funded to study the effect of diet on ulcers, that person wouldn't go where the evidence of a bacterial cause might lead. Alexander Fleming would not have been funded. Nor would Barnett Rosenberg, whose incidental observation eventually led to the discovery of the chemotherapeutic agent cisplatin.

In the past, the real advances in medicine have often come not from research specifically directed against a target but rather from discoveries made in fields other than the one being studied. In cancer, chemotherapy arose from the development of instruments of chemical warfare, the study of nutritional disease, and the effect of electric current on bacterial growth. A major tumor-suppressor gene was discovered through research on polio vaccines. Stem cells were discovered through research on bone-marrow transplants for leukemia and on recovery from whole-body irradiation from nuclear weapons. Agents for the alteration of mood and behavior came from attempts to combat surgical shock, the treatment of tuberculosis, and the search for antibiotics. When scientists were allowed to pursue whatever they found, serendipitous discovery flourished.

Today, targeted research is pretty much all there is. Yet, as Richard Feynman put it in his typically rough-hewn but insightful manner, giving more money “just increases the number of guys following the comet head.”[2] Money doesn't foster the new ideas that drive science; it only fosters applications of old ideas, most often enabling improvements rather than discoveries.

Peer Review

The government does not employ scientists at the NIH or the NSF to review grant applications. Rather, the task is accomplished through a system known as “peer review”: independent qualified experts judge the work of their “peers” on a pro bono basis.[3] The system of peer review within government granting agencies developed after World War II as a noble attempt to keep science apolitical by keeping the decision-making regarding how research money is spent within the scientific community. The same system is also used by scientific journals to determine whether the results of a research study are solid enough to be published. However, questions regarding the peer review system's flaws began to arise in the 1970s and became more insistent by the mid-1990s.

In 2005 the NIH invested more than $28 billion in medical research. Competitive grants went to 9,600 projects at some 2,800 universities, medical schools, and other research institutions in the United States and around the world. An applicant for a research grant is expected to have a clearly defined program for a period of three to five years. Implicit is the assumption that nothing unforeseen will be discovered during that time and that, even if something were, it would not cause distraction from the approved line of research. Yet the reality is that many medical discoveries were made by researchers working on the basis of a fallacious hypothesis that led them down an unexpectedly fortuitous path. Paul Ehrlich, for example, began with a false belief regarding the classification of the syphilis spirochete that nevertheless set him upon the right road to find a cure. Ladislaus von Meduna incorrectly hypothesized that schizophrenia was incompatible with epilepsy, but this led to a breakthrough convulsive therapy. John Cade, by pursuing a fanciful speculation down a false trail, was led to discover the value of lithium for treating mania. Today such people, if funded by the NIH or NSF, would not be allowed to stray from their original agenda.

The peer review system forces investigators to work on problems others think are important and to describe the work in a way that convinces the reviewers that results will be obtained. This is precisely what prevents funded work from being highly preliminary, speculative, or radical.[4] How can a venture into the unknown offer predictability of results? Biochemist Stanley Cohen, who, along with Herbert Boyer, was the first to splice and recombine genes, showed how little he thought of peer review when he speculated: “Herb and I didn't set out to invent genetic engineering. Were we to make the same proposals today, some peer review committees would come to the conclusion that the research had too small a likelihood of success to merit support.”[5]

Peer review institutionalizes dogmatism by promoting orthodoxy. Reviewers prefer applications that mesh with their own perspective on how an issue should be conceptualized, and they favor individuals whom they know or whose reputations have already been established, making it harder for new people to break into the system.[6] Indeed, the basic process of peer review demands conformity of thinking and disdains a maverick's approach. “We can hardly expect a committee,” said the biologist and historian of science Garrett Hardin, “to acquiesce in the dethronement of tradition. Only an individual can do that.”[7]
Young investigators get the message loud and clear: Do not challenge existing beliefs and practices.

So enmeshed in the conventional wisdoms of the day, so-called “peers” have again and again failed to appreciate major breakthroughs even when these were staring them in the face. One telling sign is how often pioneering researchers were relegated to presenting their findings at undesirable times, when few people were in the audience to hear them. Consider the following examples:

• The psychiatrist Pierre Deniker's presentation on the positive results of clinical trials of Thorazine at the Fiftieth French Congress of Psychiatry and Neurology in Paris in July 1952 was scheduled at the end of the last session of the week, during the lunch hour, and was delivered to no more than twenty registrants in a large auditorium.