The sad fact is that some medical research is garbage. What you and I need to know is which studies are useless or built on manipulated data, and which are groundbreaking and life-improving.
Two stories caught my eye today, both showing how research can be manipulated and twisted to “prove” an opinion or a hope.
First, I’m sorry to break the news to you, but chocolate does not accelerate weight loss in a diet. Many headlines around the world, reaching millions of people, reported the study that “proved” eating chocolate increased weight loss. The research was a deliberate hoax to test the media, and several reporters and editors fell for it. How did the researcher and correspondent for Science Magazine, John Bohannon, fake the results? By using bad science on purpose. This is not the first time Bohannon has been in the news for medical deception designed to prove a point. Last year he ran a “sting” operation for Science by submitting bogus research articles to many of the new “pay for publication” medical journal sites. In his blog he wrote about how he easily skewed the results before he even started:
Here’s a dirty little science secret: If you measure a large number of things about a small number of people, you are almost guaranteed to get a “statistically significant” result. Our study included 18 different measurements—weight, cholesterol, sodium, blood protein levels, sleep quality, well-being, etc.—from 15 people. (One subject was dropped.) That study design is a recipe for false positives.
Think of the measurements as lottery tickets. Each one has a small chance of paying off in the form of a “significant” result that we can spin a story around and sell to the media. The more tickets you buy, the more likely you are to win. We didn’t know exactly what would pan out—the headline could have been that chocolate improves sleep or lowers blood pressure—but we knew our chances of getting at least one “statistically significant” result were pretty good.
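Bohannon’s lottery-ticket analogy is the classic multiple-comparisons problem. As a rough sketch (the 5% significance threshold and independence between measurements are simplifying assumptions for illustration, not details from his study), here is how the odds of at least one false positive stack up across 18 measurements:

```python
import random

# Chance of at least one false positive when testing many independent
# measurements at a 5% significance threshold.
def chance_of_false_positive(n_measurements, alpha=0.05):
    return 1 - (1 - alpha) ** n_measurements

print(round(chance_of_false_positive(18), 2))  # prints 0.6

# The same idea by simulation: 18 "lottery tickets" per study,
# each paying off (falsely) with probability 0.05.
random.seed(0)
trials = 10_000
hits = sum(
    any(random.random() < 0.05 for _ in range(18))
    for _ in range(trials)
)
print(round(hits / trials, 2))  # close to the analytic value above
```

With 18 chances at p < 0.05, a spurious “significant” finding is more likely than not, which is exactly why the chocolate result was nearly guaranteed.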
There were many red flags that some journalists and fact-checkers missed:

- The study lasted only 21 days.
- The participants self-reported their daily weigh-ins (of course, no one would lie about her weight).
- The published paper didn’t give key information, such as how many people were in the study (only 15).
- The medical journal website that published the research paper, International Archives of Medicine, is not peer-reviewed or a top-level publishing venue for research papers. (The paper itself, Chocolate with High Cocoa Content as a Weight Loss Accelerator, has since been taken down from the site.)
- A Google search for the “Institute of Diet and Health,” which sponsored the research, shows no website, while a Google search for “Johannes Bohannon, Ph.D.,” listed as the lead researcher, has no matches (neither one existed).
To be fair, many more news outlets received the press release and did NOT report the study. Only a handful did, but among them were several outlets that normally publish high-quality work. It’s disconcerting that some mainstream media were so easily misled.
Nutrition is notoriously difficult to study. In an op-ed piece in the New York Times, Dean Ornish, founder of the nonprofit Preventive Medicine Research Institute, continues his public campaign promoting low-fat, high-carb diets. A Scientific American article by Melinda Wenner Moyer, Why Almost Everything Dean Ornish Says About Nutrition Is Wrong, looked closely at the research Ornish referenced and found it unrepresentative of what the much larger body of research indicates. Moyer doesn’t mince words: “it’s possible to cherry-pick observational studies to support almost any nutritional argument.”
Who is right in the Ornish-Moyer debate? I honestly don’t know. I would have to look at the research each side quoted, read through as much as I could access, and apply my own common sense. That could take hours. It’s frustrating that the truth is not always clear, or even measurable.
In his blog, Bohannon quotes a nutrition expert:
‘Even the well-funded, serious research into weight-loss science is confusing and inconclusive,’ laments Peter Attia, a surgeon who cofounded a nonprofit called the Nutrition Science Initiative. For example, the Women’s Health Initiative — one of the largest of its kind — yielded few clear insights about diet and health. ‘The results were just confusing,’ says Attia. ‘They spent $1 billion and couldn’t even prove that a low-fat diet is better or worse.’
Both of these happen to be food studies, but that doesn’t change the conclusion: with medical research, you must think for yourself. When you read a news report about medical research, here are the questions you should ask yourself:
- Does it make sense? Is it intuitively plausible, and have other studies found similar results?
- What is the size of the study sample? As a rough rule of thumb, about 35 people is the smallest sample that can be projected to a larger universe of people. The larger and more demographically diverse the study group, the more likely the outcome is valid. Corollary: did a lot of people drop out? Why?
- What were the scientists looking for? The study should state what the initial goal was, and if and why it changed during the research. A vague goal suggests the researchers were fishing for any benefit they could find.
- How long did the study last? The longer-lasting the claimed effect, the longer the study should be. Reality check: can anyone claim weight-loss success after only 21 days?
- Who’s paying for it? If a food council pays for a research study that shows medical benefits from using that food, look twice at the research. If a pharmaceutical company conducts research on its own drug, look carefully. Good research can be sponsored by the party that benefits, but the temptation to massage the results for financial gain is great.
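To make the sample-size point concrete, here is a minimal sketch of why small studies are noisy. The numbers are hypothetical (a made-up standard deviation of 5 pounds for weight change); the point is only that the standard error of a mean shrinks with the square root of the sample size:

```python
import math

# Standard error of a sample mean: sd / sqrt(n).
# A smaller n means a noisier estimate of the true effect.
def standard_error(sd, n):
    return sd / math.sqrt(n)

# Hypothetical weight-change data with a standard deviation of 5 lbs.
for n in (15, 35, 350):
    print(f"n={n}: standard error of the mean = {standard_error(5.0, n):.2f} lbs")
```

A 15-person study estimates the average effect with roughly five times the noise of a 350-person study, which is why a small sample can so easily produce a dramatic-looking but meaningless result.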
Read studies, especially ones that have surprising results, with a skeptical eye. And, sorry, chocolate does NOT help you lose weight. It does, however, make me happy.