For Some, a Diet Goes against the Grain

Just ask people what they worry about most in their food supply and they’ll round up the usual suspects. Their thoughts will drift to nitrites, sulfites, food colors, artificial sweeteners, monosodium glutamate, or genetically modified organisms. Yet we are far more likely to be harmed by a commonly occurring natural component in food than by any of these. Gluten, a protein found in wheat, barley, rye, and — to some extent — oats, can provoke health problems in a significant percentage of the population. Celiac disease, as gluten intolerance is usually called, may be much more common than we think.

Dr. Samuel Gee of Britain was the first to provide a clinical description of the disease. In 1888, he painted a disturbing picture of young children with bloated stomachs, chronic diarrhea, and stunted growth. Dr. Gee thought that the condition could have a dietary connection, so he put his young patients, for some strange reason, on a regimen of oyster juice. This proved to be useless. Willem K. Dicke, a Dutch physician, finally got onto the right track when he made an astute observation during World War II. The German army had tried to starve the Dutch into submission by blocking shipments of food to Holland, including wheat. Potatoes and locally grown vegetables became staples, even among hospital patients, and Dicke noted that his celiac patients improved dramatically. Moreover, in the absence of wheat and grain flours, no new cases of celiac disease occurred.

By 1950, Dicke had figured out what was going on. Gluten, a water-insoluble protein found in wheat, was the problem. As later research showed, the immune systems of celiac patients mistake a particular component of gluten, namely gliadin, for a dangerous invader, and they mount an antibody attack against it. This triggers the release of molecules called cytokines, which in turn wreak havoc upon the villi — the tiny, fingerlike projections that line the surface of the small intestine. The villi are critical in providing the large surface area needed for the absorption of nutrients from the intestine into the bloodstream.

In celiac disease, the villi become inflamed and markedly shortened, and their rate of nutrient absorption is effectively reduced. This has several consequences. Nonabsorbed food components have to be eliminated, and this often results in diarrhea. Bloating can also occur when bacteria in the gut metabolize some of these components and produce gas. But the greatest worry for the celiac disease sufferer is loss of nutrients. Protein, fat, iron, calcium, and vitamin absorption can drop dramatically, and this results in weight loss and a plethora of complications. Luckily, if the disease is recognized and a gluten-free diet initiated, the patient can lead a normal life.

Diagnosis of celiac disease involves the physician taking a biopsy sample from the duodenum, the uppermost section of the small intestine, via a gastroscope passed down through the patient’s mouth. Microscopic analysis shows the damaged villi. Recently, blood tests have also become available. One of these screens for the presence of antigliadin antibodies, but it is not foolproof. Only about half the patients with positive results actually show damaged villi upon biopsy. The anti-tissue transglutaminase (anti-tTG) test is a much better diagnostic tool, but it is available only in specialized labs.

There is a great deal of interest in these tests because of their potential value in identifying celiac cases and perhaps even in screening the population. Celiac disease, which has a genetic component, does not necessarily begin immediately after gluten is first introduced into the diet. The onset of disease can occur at any age. In adults, the symptoms are usually much less dramatic than they are in young children. The first signs are often unexplained weight loss and anemia due to poor iron and folic acid absorption. A patient’s stools tend to be light-colored, smelly, and bulky because of unabsorbed fat. Symptoms can include a blister-like rash, joint and bone pain, stomachache, tingling sensations, and even headaches and dizziness.

Identification of celiac patients is important not only because much of their misery can be prevented by a gluten-free diet, but also because a recent study showed that over a thirty-year period the death rate among celiac patients was double that expected in the general population. Delayed diagnosis and poor compliance with diet increase the risk. The major cause of death among celiac disease sufferers is non-Hodgkin’s lymphoma, a type of cancer known to be associated with celiac disease. A less severe but more common complication than cancer is osteoporosis, brought on by poor absorption of calcium and vitamin D.

Unfortunately, a gluten-free diet is not that easy to follow. Wheat and barley crop up in a wide assortment of products. Patients have to become veritable sleuths; they quickly discover that foods as diverse as ice cream, luncheon meats, ketchup, chocolate, and even communion wafers can contain gluten. Fortunately, the Celiac Association distributes excellent information on dietary dos and don’ts, and consumers can now purchase a large assortment of gluten-free products based on rice, corn, and soy.

The plan of action for biopsy-diagnosed celiacs is clear. They must adhere religiously to a gluten-free diet in order to eliminate symptoms and reduce the risk of osteoporosis and lymphoma. But what about people who have no overt symptoms yet have a positive blood-test result? Surveys indicate that about one in two hundred people may fall into this category. The biopsies of some may show normal villi initially, but these people are considered to have latent celiac disease, which may become symptomatic years later. Others may have flat villi without symptoms, a condition referred to as silent celiac disease, which can become aggressive at any time. Should these people follow a preventative, difficult-to-maintain diet? At this point, nobody really knows, since we still have much to learn about the effects of gluten. Recently, for example, researchers discovered that celiac patients who complained of headaches showed brain inflammation on MRI scans, and that the problem resolved on a gluten-free diet. Certain individuals have provided anecdotal and controversial evidence that the condition of some autistic children improves when gluten is eliminated from their diets. However, there is no evidence that these children have celiac disease.

So, on the one hand, it certainly seems that we have not yet uncovered all of gluten’s potential mischief. On the other hand, one intriguing method of reducing gluten exposure has emerged. Preliminary research suggests that it may be possible to remove gluten’s offensive component by genetically modifying wheat. That would be a boon to celiac patients and perhaps even to those who may be suffering in silence.

Saccharin: From Back Alley to Tabletop

“Psst . . . I’ve got some of the good stuff here,” the shadowy figure in the back alley whispered. Word spread quickly, and soon people were scurrying from everywhere to purchase little bags of the white powder. They tasted it carefully, hoping for just the right sensation. But it was not euphoria they were after, it was sweetness. All they were looking for was a personal supply of contraband saccharin. Then life would be sweet once more.

In 1902, the German government passed legislation prohibiting the sale of saccharin to healthy consumers. Pharmacies could still sell the sweetener to diabetics, but others would have to make their strudel the old-fashioned way — with sugar. If they could afford it. The saccharin ban had nothing to do with safety concerns; it was the result of lobbying efforts by the powerful beet-sugar industry. Saccharin was cheap to produce and had quickly become the “sugar of the poor.” Fearing the loss of tax revenue from the sugar manufacturers, many European governments agreed to curb the sale of saccharin. This gave rise to a huge black market, amply supplied by producers in Switzerland, where saccharin remained legal. It wasn’t until World War I that a sugar shortage brought saccharin back into favor in Europe. It rapidly became the prime sugar substitute, a status it has retained to this day. But most people who sweeten their lives with saccharin have no idea of the fascinating story that lies behind its discovery.

It all started the day a young American named Ira Remsen noticed a bottle sitting on a table in his doctor’s office. Most people would not have given it a second glance, but to Ira, that bottle of nitric acid presented an unexpected opportunity to clear up the meaning of a confusing phrase he had come across in his chemistry book. What did “nitric acid acts upon copper” really mean? Here was his chance to find out. Digging in his pockets, he found a couple of copper pennies. Working quickly, Ira poured a little of the acid over the coins, and, to his amazement, this produced an immediate reaction. “A green-blue liquid foamed and fumed over the cent,” he later recalled. “The air in the neighborhood of the performance became dark red. A great colored cloud arose. This was disagreeable and suffocating!”

When Ira tried to fling the mess out the window, he discovered that nitric acid acts not only upon copper, but also upon fingers. Wiping his burning fingers on his trousers, he made another discovery: nitric acid also acts upon fabrics. And so it was that Ira Remsen learned the meaning of the common chemical expression “acts upon.” But he learned something else as well. He learned that the only way to understand chemical action is to experiment and witness the results — in other words, to work in a laboratory. On that day back in the 1860s, Ira Remsen decided to dedicate his life to the fledgling field of chemistry.

However, a career in chemistry was not what Ira’s parents had in mind for their son. They wanted him to be a doctor. So Ira dutifully worked for, and achieved, a doctorate in medicine. But as soon as he’d done that, he left his home in the United States and went to Germany to pursue his real love. With a doctorate in chemistry in hand, he returned to the U.S. to take up a position as professor of chemistry at Williams College. The job was a frustrating one, because, unlike their German counterparts, American colleges did not make research a priority. Remsen was forced to concentrate on teaching, but he taught with great enthusiasm. He resolved that his students would experience and understand chemical reactions, starting, of course, with the one between nitric acid and copper. The reddish vapor that filled the air was due to the formation of nitrogen dioxide, he explained, and the bluish residue was copper nitrate. Remsen also wrote a widely acclaimed textbook, which introduced students to the principles of chemistry. His increasing renown won him the chemistry chair at a new university that had been founded in Baltimore with an endowment from a local merchant by the name of Johns Hopkins.
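For the record, the reaction that so captivated Remsen balances out neatly. With concentrated nitric acid, the standard textbook equation is

$$\mathrm{Cu} + 4\,\mathrm{HNO_3} \rightarrow \mathrm{Cu(NO_3)_2} + 2\,\mathrm{NO_2}\uparrow + 2\,\mathrm{H_2O}$$

with the nitrogen dioxide accounting for the choking reddish fumes and the copper nitrate for the blue-green residue.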

Remsen jumped at the opportunity to get in on the ground floor and establish a program that would focus on research. Indeed, Johns Hopkins University soon became the center for the study of chemistry in the United States, attracting students and researchers from around the world. One of these was Constantine Fahlberg, a German chemist who wanted to continue his studies under Remsen. The project Remsen assigned him was not a particularly exciting one. This wasn’t surprising, since Remsen was interested in science as a form of higher culture, not as a practical tool to solve problems. He asked Fahlberg to study the oxidation of certain coal-tar derivatives known as toluene sulfonamides, simply because no one had done this before. Fahlberg, it seems, was a pretty sloppy chemist — he often didn’t bother to wash his hands before leaving the laboratory. His sloppiness, though, turned into a stroke of luck.

At dinner one evening, Fahlberg noticed that the slice of bread he had picked up tasted unusually sweet. It didn’t take him long to figure out what had happened. He traced the sweetness to a substance he had been handling in the laboratory, and he immediately brought this chance discovery to the attention of Remsen. In 1880, the two scientists published their finding in The American Chemical Journal, noting that the new compound was hundreds of times sweeter than sugar. Remsen looked upon this as a mere curiosity, but Fahlberg immediately saw the potential for commercial exploitation. He knew that sugar prices fluctuated greatly, and that a low-cost sweetening agent would be most welcome. Those on a weight-loss regime, Fahlberg thought, would also find the new product appealing. The product would dramatically lower the calorie count of sugar-sweetened foods, since it was so sweet that only a tiny amount would produce the desired sweetness. Fahlberg coined the term “saccharin” for his discovery, after the Latin word for sugar, and he secretly patented the process for making it. Within a few years, saccharin became the world’s first commercial nonnutritive sweetener, and it made Fahlberg a wealthy man.
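A quick calculation shows why. Taking a sweetness factor of three hundred as a representative value (modern estimates put saccharin at three hundred to five hundred times the sweetness of sucrose), replacing, say, the ten grams of sugar in a well-sweetened cup of coffee requires only

$$\frac{10\ \mathrm{g}}{300} \approx 0.03\ \mathrm{g}$$

of saccharin. Since sucrose supplies about four calories per gram, some forty calories vanish from each cup, and the pinch of saccharin that replaces them contributes essentially none.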

Remsen did not resent the fact that neither he nor Johns Hopkins University ever made a dime from saccharin. He was a pure scientist at heart and did not much care whether his research turned out to be financially profitable. But he did develop an intense dislike for Fahlberg, who, by all accounts, tried to take sole credit for the discovery. “Fahlberg is a scoundrel,” Remsen often said, “and it nauseates me to hear my name mentioned in the same breath with him!” But due to the importance of the saccharin discovery, their names will be forever linked. The importance is twofold: first, the commercial production of saccharin is the earliest example of a technology transfer from university research to the marketplace; second, and more importantly, saccharin introduced the concept of a nonnutritive sweetener, an idea that has been mired in controversy from the moment it was first raised.

Saccharin went into commercial production in Germany, where Fahlberg had taken out a patent. It wasn’t until 1902 that John Francis Queeny, a former purchasing agent for a drug company in St. Louis, decided to take a chance on manufacturing saccharin in the United States. Here the sweetener was not burdened by any of the legal problems that were arising in Europe. He borrowed fifteen hundred dollars and founded a company that at first had only two employees — himself and his wife. Queeny decided to give the company his wife’s maiden name, and Monsanto was born. At first, the company’s only product was saccharin, but it quickly diversified to become one of the largest chemical companies in the world.

The unfettered use of saccharin in America did not last long, however — thanks mostly to the work of Dr. Harvey W. Wiley, who, in 1883, was made chief of the Department of Agriculture’s Bureau of Chemistry. The bureau had been created to monitor the safety of the food supply when the population began to increase dramatically after the Civil War, resulting in large-scale changes in food production. People were moving into the cities from farms, and they no longer ate every meal at home or prepared every meal from scratch. A burgeoning food industry was gearing up to meet the demand for prepared foods and the preservatives needed to make them safe. Wiley had become concerned about the unregulated use of such food additives, an issue to which he had become sensitized during his days as a professor of chemistry at Purdue University. In 1881, he had published a paper on the adulteration of sugar with glucose, and he had looked into the problem of coloring cheese with lead salts. Now Wiley worried about the extensive use of formaldehyde, benzoic acid, and boric acid as preservatives. Since they poisoned bacteria and molds, could they also poison humans? He decided to find out.

The hallmark of Wiley’s crusade for safer food was the establishment of the celebrated Poison Squad. Wiley recruited twelve healthy young men, asking them to meet every day for lunch and dine on foods prepared with a variety of additives. If the men developed any unusual symptoms, Wiley would move for a ban of the additive. In retrospect, this was a primitive system, because it revealed nothing about exposure to small amounts of chemicals over the long term. Still, Wiley’s work publicized the need for food regulation, and his efforts finally culminated in the passage of the Pure Food and Drugs Act of 1906, which for the first time gave the government some teeth with which to bite food and drug adulterers.

Dr. Wiley became a food-safety zealot, and he caught saccharin up in the net he cast to catch chemical culprits. He vigorously attacked saccharin as a “coal-tar by-product totally devoid of food value and extremely injurious to health.” Unfortunately for Wiley, President Theodore Roosevelt had been prescribed the sweetener by his physician, and he loved the stuff. “Anyone who says saccharin is injurious to health is an idiot,” Roosevelt proclaimed, and he decided to curtail Wiley’s authority. The president established what he called a “referee board of scientists” — ironically, with Ira Remsen as its head — to scrutinize Wiley’s recommendations. The board found saccharin to be safe but suggested that its use be limited to easing the hardship of diabetics. That suggestion had no legal bearing, and it was soon forgotten in the face of massive industry maneuvering to satisfy the public’s demand for nonnutritive sweeteners.

The saccharin bandwagon rolled happily along until 1977, when a Canadian study suggested an increased incidence of bladder cancer in male rats fed the equivalent of eight hundred diet drinks a day, rats whose mothers had been dosed with the same amount of saccharin. Though the study was ridiculed by saccharin promoters as irrelevant to human subjects, the Canadian government banned saccharin as a food additive but allowed its continued use as a sweetener that consumers added themselves. The U.S. Food and Drug Administration (the descendant of Wiley’s bureau) also proposed a ban, but a massive public outcry prompted Congress to permit the sale of saccharin pending further studies. Its continued use as an additive was allowed, but the familiar little pink packets had to carry a warning label stating that saccharin was known to cause cancer in laboratory animals.

Subsequent research failed to clear saccharin of all blame as a carcinogen, but human epidemiological studies have shown that if there is any risk at all, it is a very small one. In fact, in 2000 the U.S. government removed saccharin from its official list of human carcinogens, and President Clinton signed a bill eliminating the requirement for a warning label on the product. Canada still does not allow saccharin as an additive. But you don’t have to purchase it stealthily in back alleys. You can buy it legally in pharmacies. Diabetics are certainly grateful for that.

Ira Remsen, I’m sure, could never have imagined where his little experiment with nitric acid would eventually lead. And why was there nitric acid in the doctor’s office in the first place? Because in those days, silver nitrate was used as an antiseptic and was generated by nitric acid “acting upon” silver.
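That last reaction also balances out simply; with dilute nitric acid it is conventionally written as

$$3\,\mathrm{Ag} + 4\,\mathrm{HNO_3} \rightarrow 3\,\mathrm{AgNO_3} + \mathrm{NO}\uparrow + 2\,\mathrm{H_2O}$$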
