The Lucky Years: How to Thrive in the Brave New World of Health

I’m a big proponent of vaccination among those who are eligible. Vaccines are not just for children hoping to avoid the ills of yesteryear like polio and chicken pox. When the scientific data clearly and overwhelmingly prove that a particular vaccine can dramatically reduce the risk of contracting a certain illness with few to no side effects, then we as a society should demand universal inoculation (the only exception being those who cannot be vaccinated due to certain disorders or because they are immune compromised). Much in the way we prevent our kids from suffering from measles and diphtheria, we can prevent adults from experiencing serious—and sometimes fatal—afflictions that nowadays have inexpensive and painless preventives in the form of a vaccine. These include the human papillomavirus, influenza, pneumonia, and shingles. This mandate alone could put a big dent in our rates of communicable diseases later in life and, as a result, bring health care costs down significantly.

In this cartoon, James Gillray drew a scene at the Smallpox and Inoculation Hospital at St. Pancras, showing Jenner’s cowpox vaccine being administered to frightened young women, and cow parts emerging from the subjects’ bodies. The cartoon was inspired by the controversy at the time over administering materials from animals to humans.

The Nocebo Effect and the Limits of Nutritional Studies

In 2015, while attending the World Economic Forum in Davos, Switzerland, I found myself in an uncomfortable position while on a panel about nutrition called “Let Food Be Thy Medicine.” I was once again reminded of the limits and pitfalls of our own belief systems.

The panel was intended to explore how our daily diet and dietary habits can become a cornerstone of health. I got the conversation started with statements that most people probably didn’t want to hear. Nutrition is one of the thorniest topics in health; it tends to get under people’s skin. Everyone has an opinion on the subject, and there is so much conflicting data in this area that the noise becomes deafening.

One question I like to raise is what exactly is nutrition? How can we define it? For someone like me, who tells two or three people a week that I have nothing left to give them to extend their lives, let alone cure them, it’s frustrating that we don’t have reliable data from nutrition that show how to prevent disease and live longer through specific dietary protocols. If we did, then the pendulum wouldn’t swing so wildly back and forth on what’s considered “good” for us. Just reflect on what you’ve experienced in your own life. Most of us have learned to view eggs and other animal products like red meat as infrequent indulgences due to their fat and cholesterol content. Americans have been told to choose a low-fat diet rich in whole grains and complex carbohydrates. But in the fall of 2014, experts on the government’s committee charged with setting dietary guidelines changed their tune. They acknowledged that there was “no appreciable relationship” between dietary cholesterol and blood cholesterol.[5] They also admitted that a low-fat diet may not be ideal.

What makes nutrition studies and their resulting policies so challenging is that they rely on observational examinations during which researchers follow large groups of people over long periods, keeping track of what they are eating and what the outcomes are. This is not, however, the best kind of science. They often base their findings on questionnaires in which people rely on their memories to recall their dietary and other choices. But think about that: if you’re enrolled in a study and asked how many times you raided the cookie jar or visited a fast-food restaurant in the past week or month, how honest and reliable can you be? Even the most rigorous observational studies can only produce results that show associations, not causation. In other words, such studies can suggest hypotheses (e.g., saturated fats are linked to heart disease), but fail to really prove anything (e.g., saturated fats cause heart disease).

So in the end, there really is no firm data in this department. Just a lot of generalized statements and anecdotal evidence: “I went on a gluten-free diet and felt amazing,” “I went Paleo and lost forty pounds,” “I cut out sugar and grew stronger nails and longer hair.” While I think we all can agree that a diet high in processed sugars and fats won’t do anyone good, it’s hard to make scientifically proven conclusions about all the nuances of nutrition. I do have hope that one day we’ll be able to conduct nutrition studies that can rule out all the variables and offer reliable recommendations. Even then, however, those recommendations may not be for everyone. That’s why I’m a big believer in taking nutrition into your own hands and doing what the great medieval philosopher Maimonides suggested: try everything and see how certain foods and dietary protocols make you feel. Listen to your body and follow its signals. There may never be a right and wrong way to eat. Diet is contextual. Keep in mind that people have been eating various diets based on their cultures for thousands of years—diets that have sustained populations and allowed them to live good, long lives. Only in modern Western civilization do we incessantly search for the heroes and villains in our dietary habits and label them as such.

Although we have some evidence that a Mediterranean diet, for example, is associated with a reduced risk of death from heart disease and cancer, as well as a lower incidence of Parkinson’s and Alzheimer’s diseases, that’s not to say other diets are therefore bad or increase the risk of these ailments. Some traditional populations in Africa, such as the Maasai people in Kenya and Tanzania, live primarily off raw milk and raw blood and occasional meat from cattle. And—surprise!—they have low incidence of the diseases typically attributed to a high-fat, high-cholesterol diet, such as heart disease and cancer. So how do we reconcile that fact in our modern arguments about the ills of meat, dairy, and not having a diverse diet?

The gluten debate is a particularly intriguing one. Countless people swear by a gluten-free diet, which avoids the foods in which the protein composite gluten is found: wheat, rye, barley, and other grains. Gluten is what gives bread its delicious chewiness. Gluten-free is a big industry. Sales of gluten-free products are estimated to hit $15 billion in 2016, and nearly a third of Americans try to avoid the ingredient. These people believe that gluten causes intestinal distress even in the absence of celiac disease, an autoimmune disorder triggered by gluten that affects only 1 percent of Americans. Celiac patients must avoid gluten, but why is everyone else jumping on this bandwagon? Is there really such a thing as non-celiac gluten sensitivity?

Peter Gibson of Monash University in Australia is one of the researchers who first documented non-celiac gluten sensitivity in a small study he published in 2011.[6] But after careful thinking, Gibson wasn’t satisfied with his results. He wondered, as I would, how a compound so ubiquitous in any diet today could be cause for serious concern for so many people. So Gibson returned to the drawing board and upped the ante, taking his next experiment to the extreme—something we don’t normally see happen in nutrition studies. He attempted to validate—or invalidate—his previous finding.

What he did was clever.[7] He took 37 people who claimed to be gluten sensitive and had irritable bowel syndrome (IBS) and provided them with meals that eliminated ingredients associated with gastrointestinal distress in some people. These potential triggers included lactose (from milk products); certain preservatives like propionate, sulfites, and nitrites; and fermentable, poorly absorbed short-chain carbohydrates (technically known as fermentable oligosaccharides, disaccharides, monosaccharides, and polyols, or FODMAPs). He also collected nine days’ worth of urine and poop, presumably to ensure no one cheated. The individuals cycled through different diets—high-gluten, low-gluten, and no-gluten—but they didn’t know which diet plan they were on at any given time. And guess what: all of the diets—even the gluten-free diet—allegedly caused gas, bloating, pain, and nausea to a similar degree. It didn’t matter if the diet contained gluten. In the words of Gibson: “In contrast to our first study . . . we could find absolutely no specific response to gluten.”[8] Although this was also a small study, a larger one published later confirmed the findings.

How do we explain this unexpected result? This is where the science gets interesting. It could be that people expected to feel worse on the study’s diets, so they did—a phenomenon called the “nocebo” effect, a wordplay on the placebo effect. After all, they did have to pay close attention to how their tummies felt, which alone might entail some psychosomatic response. Moreover, it’s been suggested that gluten may be the wrong villain and that these other potential triggers, especially the FODMAPs, are to blame. These ingredients often travel with gluten. It may in fact be the carbohydrate component rather than the gluten part of the wheat that is causing symptoms. Other constituents of wheat might also be problematic for some people. That said, it still doesn’t explain why people in the study reacted negatively to diets that were free of all dietary triggers. Which is why sweeping, absolute statements like “Gluten is always bad” or “Organic is always good” miss the point, especially when it comes to preventing illnesses. The difference between good and bad revolves around individual context; what is right for each person is a highly individualized matter, especially in light of the fact that much of the science about nutrition does not always hold up.

I’m all for prevention. It’s the best way today to avoid the illnesses that can take our lives prematurely, especially cancer. But we can’t make blanket statements about the superior power of prevention through dietary strategies alone, to the exclusion of other therapies. A new buzzword has crept into the lay vernacular: lifestyle medicine. This term refers to using basic lifestyle interventions—nutrition, exercise, supplementation, stress management—rather than pharmaceutical drugs to prevent and sometimes even treat disease. But here’s the thing: this conversation gets corrupted by the same black-and-white thinking that plagues many other areas of health. There are lots of ways to reach a goal, and those ways should be distinct to each individual.

The Hazards of Broad, Sweeping Statements

Given my views on this subject, you can imagine my reaction on that World Economic Forum panel when my colleague Dean Ornish, a physician and researcher based in San Francisco who is focused on preventive medicine mostly through diet, said bluntly: “Drugs and surgery don’t work nearly as well as we once thought.”[9] He was speaking within the context of treating heart patients, but then went on to say more specifically that diabetes and heart disease drugs don’t work in certain situations, and he referred to one of his own studies that concluded “intensive lifestyle interventions” can positively affect the progression of early-stage prostate cancer. I immediately protested against his broad assertions, which frankly made me look terrible—like the bad guy who prefers drugs from Big Pharma. But Ornish’s 2005 study shows how declaratory statements, especially from a small study such as this one, can be misleading and cause confusion.

I should preface this story with the fact that Ornish’s original claim to fame was the reversal of heart disease through lifestyle changes. But cancer is an entirely different animal. So when he broached my subject area in our panel discussion, my ears pricked up. The purpose of Ornish’s twelve-month study was to examine the impact of changes in diet and lifestyle on men with early-stage, low-grade prostate cancer.[10] All of the 93 men in the experiment had chosen not to undergo active treatment. Due to the low-risk nature of their prostate cancer, they opted for a “watchful waiting” (now called “active surveillance”) approach instead of conventional treatment such as surgery, drugs, and/or radiation. Active surveillance, which is now routine for early-stage prostate cancers, means following the disease with repeat biopsies on a regular basis and treating it only if it progresses.

The men were randomly assigned either to an experimental group that was asked to make comprehensive lifestyle changes or to a “usual care” control group. “Usual care” meant following their doctor’s advice in terms of general lifestyle changes. However, the study was designed to decrease the likelihood that control-group patients would make diet and lifestyle changes comparable to those of the experimental group, which would have made the results difficult to interpret. There were 44 men in the experimental group and 49 in the control. The actual lifestyle intervention taken by the 44 men was the following:

• vegan diet supplemented with soy (1 daily serving of tofu plus 58 grams of a fortified soy protein powdered beverage)

• fish oil (3 grams daily)

• vitamin E (400 IU daily)

• selenium (200 micrograms daily)

• vitamin C (2 grams daily)
