The use of this chlorocarbon compound as an anesthetic had a number of advantages over ether: chloroform worked faster and smelled better, and less of it was required. As well, recovery from a procedure in which chloroform had been administered was faster and less unpleasant than from surgery using ether. The extreme flammability of ether was also a problem. It formed an explosive mixture with oxygen, and the smallest spark during a surgical procedure, even from metal instruments clanking together, could cause ignition.
Chloroform anesthesia was readily accepted for surgical cases. Even though some patients died, the associated risks were considered small. As surgery was often a last resort and as patients sometimes died from shock during surgery without anesthetics anyway, the death rate was deemed acceptable. Because surgical procedures were performed rapidly, a practice that had been essential without anesthesia, patients were not exposed to chloroform for any great length of time. It has been estimated that during the American Civil War almost seven thousand battlefield surgeries were performed under chloroform, with fewer than forty deaths due to the use of the anesthetic.
Surgical anesthesia was universally recognized as a great advance, but its use in childbirth was controversial. The reservations were partly medical; some physicians rightly expressed concerns about the effect of chloroform or ether on the health of an unborn child, citing observations of reduced uterine contractions and lowered rates of infant respiration with a delivery under anesthesia. But the issue was about more than just infant safety and maternal well-being. Moral and religious views upheld the belief that the pain of labor was necessary and righteous. In the Book of Genesis women, as Eve's descendants, are condemned to suffer during childbirth as punishment for her disobedience in Eden: “In sorrow thou shalt bring forth children.” According to strict interpretation of this biblical passage, any attempt to alleviate labor pain was contrary to the will of God. A more extreme view equated the travails of childbirth with atonement for sin, presumably the sin of sexual intercourse, the only means of conceiving a child in the mid-nineteenth century.
But in 1853 in Britain, Queen Victoria delivered her eighth child, Prince Leopold, with the assistance of chloroform. Her decision to use this anesthetic again at her ninth and final confinement, the birth of Princess Beatrice in 1857, hastened the acceptance of the practice, despite criticism leveled against her physicians in The Lancet, the respected British medical journal. Chloroform became the anesthetic of choice for childbirth in Britain and much of Europe; ether remained more popular in North America.
In the early part of the twentieth century a different method of pain control in childbirth gained rapid acceptance in Germany and quickly spread to other parts of Europe. Twilight Sleep, as it was known, consisted of administration of scopolamine and morphine, compounds that were discussed in Chapters 12 and 13. A very small amount of morphine was administered at the beginning of labor. It reduced pain, although not completely, especially if the labor was long or difficult. Scopolamine induced sleep and, more important for the doctors endorsing this combination of drugs, ensured that a woman had no memory of her delivery. Twilight Sleep was seen as the ideal solution for the pain of childbirth, so much so that a public campaign promoting its use began in the United States in 1914. The National Twilight Sleep Association published booklets and arranged lectures extolling the virtues of this new approach.
Serious misgivings expressed by members of the medical community were labeled as excuses for callous and unfeeling doctors to retain control over their patients. Twilight Sleep became a political issue, part of the larger movement that eventually gained women the right to vote. What seems so bizarre about this campaign now is that women believed the claims that Twilight Sleep removed the agony of childbirth, allowing the mother to awaken refreshed and ready to welcome her new baby. In reality women suffered the same pain, behaving as if no drugs had been administered, but the scopolamine-induced amnesia blocked any memory of the ordeal. Twilight Sleep provided a false picture of a tranquil and trouble-free maternity.
Like the other chlorocarbons in this chapter, chloroform, for all its blessings to surgical patients and the medical profession, also turned out to have a dark side. It is now known to cause liver and kidney damage, and high levels of exposure increase the risk of cancer. It can damage the cornea of the eye, cause skin to crack, and result in fatigue, nausea, and irregular heartbeat, along with its anesthetic and narcotic actions. When exposed to high temperatures, air, or light, chloroform forms chlorine, carbon monoxide, phosgene, and/or hydrogen chloride, all of which are toxic or corrosive. Nowadays working with chloroform requires protective clothing and equipment, a far cry from the splash-happy days of the original anesthetic providers. But even if its negative properties had been recognized more than a century ago, chloroform would still have been considered a godsend rather than a villain by the hundreds of thousands who thankfully inhaled its sweet-smelling vapors before surgery.
There is no doubt that many chlorocarbons deserve the role of villain, although perhaps that label would be better applied to those who have knowingly disposed of PCBs in rivers, argued against the banning of CFCs even after their effects on the ozone layer were demonstrated, indiscriminately applied pesticides (both legal and illegal) to land and water, and put profit ahead of safety in factories and laboratories around the world.
We now make hundreds of chlorine-containing organic compounds that are not poisonous, do not destroy the ozone layer, are not harmful to the environment, are not carcinogenic, and have never been used in gas warfare. These find a use in our homes and industries, our schools and hospitals, and our cars and boats and planes. They garner no publicity and do no harm, but they cannot be described as chemicals that changed the world.
The irony of chlorocarbons is that those that have done the most harm or have the potential to do the most harm seem also to have been the very ones responsible for some of the most beneficial advances in our society. Anesthetics were essential to the development of surgery as a highly skilled branch of medicine. The development of refrigerant molecules for use in ships, trains, and trucks opened new trade opportunities; growth and prosperity followed in undeveloped parts of the world. Food storage is now safe and convenient with home refrigeration. We take the comfort of air-conditioning for granted, and we assume our drinking water is safe and that our electrical transformers will not burst into flames. Insect-borne diseases have been eliminated or greatly reduced in many countries. The positive impact of these compounds cannot be discounted.
17. MOLECULES VERSUS MALARIA
The word malaria means “bad air.” It comes from the Italian words mal aria, because for many centuries this illness was thought to result from poisonous mists and evil vapors drifting off low-lying marshes. The disease, caused by a microscopic parasite, may be the greatest killer of humanity for all time. Even now there are by conservative estimates 300 million to 500 million cases a year worldwide, with two to three million deaths annually, mainly of children in Africa. By comparison, the 1995 Ebola virus outbreak in Zaire claimed 250 lives in six months; more than twenty times that number of Africans die of malaria each day. Malaria is transmitted far more rapidly than AIDS: each HIV-positive patient is estimated to infect between two and ten others, while each infected malaria patient can transmit the disease to hundreds.
There are four different species of the malaria parasite (genus Plasmodium) that infect humans: P. vivax, P. falciparum, P. malariae, and P. ovale. All four cause the typical symptoms of malaria (intense fever, chills, terrible headache, muscle pains) that can recur even years later. The most lethal of these four is falciparum malaria. The other forms are sometimes referred to as “benign” malarias, although the toll they take on the overall health and productivity of a society is anything but benign. Malaria fever is usually periodic, spiking every two or three days. With the deadly falciparum form this episodic fever is rare, and as the disease progresses, the infected patient becomes jaundiced, lethargic, and confused before lapsing into a coma and dying.
Malaria is transmitted from one human to another through the bite of the anopheles mosquito. A female mosquito requires a meal of blood before laying her eggs. If the blood she obtains comes from a human infected with malaria, the parasite is able to continue its life cycle in the mosquito gut and be passed on when another human supplies the next meal. It then develops in the liver of the new victim; a week or so later it invades the bloodstream and enters the red blood corpuscles, now available to another bloodsucking anopheles.
We now consider malaria to be a tropical or semitropical disease, but until very recently it was also widespread in temperate regions. References to a fever, most probably malaria, occur in the earliest written records of China, India, and Egypt from thousands of years ago. The English name for the disease was “the ague.” It was very common in the low-lying coastal regions of England and the Netherlands, areas with extensive marshlands and the slow-moving or stagnant waters ideal for the mosquito to breed. The disease also occurred in even more northern communities: in Scandinavia, the northern United States, and Canada. Malaria was known as far north as the areas of Sweden and Finland near the Gulf of Bothnia, very close to the Arctic Circle. It was endemic in many countries bordering the Mediterranean Sea and the Black Sea.
Wherever the anopheles mosquito thrived, so did malaria. In Rome, notorious for its deadly “swamp fever,” each time a papal conclave was held, a number of the attending cardinals would die from the disease. In Crete and the Peloponnesus peninsula of mainland Greece, and other parts of the world with marked wet and dry seasons, people would move their animals to the high hill country during the summer months. This may have been as much to escape malaria from the coastal marshes as to find summer pastures.
Malaria struck the rich and famous as well as the poor. Alexander the Great supposedly died of malaria, as did the African explorer David Livingstone. Armies were particularly vulnerable to malaria epidemics; sleeping in tents, makeshift shelters, or out in the open gave night-feeding mosquitoes ample opportunity to bite. Over half the troops in the American Civil War suffered from annual bouts of malaria. Can we possibly add malaria to the woes suffered by Napoleon's troops, at least in the late summer and fall of 1812, as they began their great push to Moscow?
Malaria remained a worldwide problem well into the twentieth century. In the United States in 1914 there were more than half a million cases of malaria. In 1945 nearly two billion people in the world were living in malarial areas, and in some countries 10 percent of the population was infected. In these places malaria-related absenteeism in the workforce could be as high as 35 percent and up to 50 percent for schoolchildren.
QUININE: NATURE'S ANTIDOTE
With statistics like these it is little wonder that for centuries a number of different methods have been used to try to control the disease. They have involved three quite different molecules, all of which have interesting and even surprising connections to many of the molecules mentioned in earlier chapters. The first of these molecules is quinine.
High in the Andes, between three thousand and nine thousand feet above sea level, there grows a tree whose bark contains an alkaloid molecule, without which the world would be a very different place today. There are about forty species of this tree, all of which are members of the Cinchona genus. They are indigenous to the eastern slopes of the Andes, from Colombia south to Bolivia. The special properties of the bark were long known to the local inhabitants, who surely passed on the knowledge that a tea brewed from this part of the tree was an effective fever cure.
Many stories tell how early European explorers in the area found out about the antimalarial effect of cinchona bark. In one a Spanish soldier suffering a malarial episode drank water from a pond surrounded by cinchona trees, and his fever miraculously disappeared. Another account involves the countess of Chinchón, Doña Francisca Henriques de Rivera, whose husband, the count of Chinchón, was the Spanish viceroy to Peru from 1629 to 1639. In the early 1630s Doña Francisca became very ill from malaria. Traditional European remedies were ineffectual, and her physician turned to a local cure, the cinchona tree. The genus was named (although misspelled) after the countess, who survived thanks to the quinine present in its bark.
These stories have been used as evidence that malaria was present in the New World before the arrival of Europeans. But the fact that the Indians knew that the kina tree (a Peruvian word, which in Spanish became quina) cured a fever does not prove that malaria was indigenous to the Americas. Columbus arrived on the shores of the New World well over a century before Doña Francisca took the quinine cure, more than enough time for malarial infection to find its way from early explorers into local anopheles mosquitoes and spread to other inhabitants of the Americas. There is no evidence that the fevers treated by quina bark in the centuries before the conquistadors arrived were malarial. It is now generally accepted among medical historians and anthropologists that the disease was brought from Africa and Europe to the New World. Both Europeans and African slaves would have been a source of infection. By the mid-sixteenth century the slave trade to the Americas from West Africa, where malaria was rife, was already well established. In the 1630s, when the countess of Chinchón contracted malaria in Peru, generations of West Africans and Europeans harboring malarial parasites had already established an enormous reservoir of infection awaiting distribution throughout the New World.