The Cardinal Sins of Skewed Research, Part 5: Burning Britches

By Drs. Michael and Mary Dan Eades | April 29, 2019

In the last of our series on sleight-of-hand maneuvers that skew research and pollute the body of scientific literature, we point the finger at britches-burning deception in the form of outright fraud.

Scientific fraud can take many forms: using specimen pictures from other studies; substituting Western blot images from old research; fudging data; filling in or deleting outliers; or just plain reporting conclusions utterly at odds with the data collected. All such methods produce research that is suspect because the resulting conclusions have not been reached using the scientific method—not to mention the doubt cast by the outright deception. Many instances of this type of misbehavior are doubtless driven by the publish-or-perish dictum. Below are a few astounding instances.

Dr. Brian Wansink, a well-known professor at Cornell, sowed the seeds of his own destruction in late 2016 with an honest blog post (now removed) about how he helped a foreign graduate student generate a handful of papers from one data set. The title of the post, “The Grad Student Who Never Said ‘No,’” described Wansink’s efforts to shepherd a willing graduate student through the process of slicing and dicing data that apparently didn’t add up to much into something “worthy” of publication.

At each step in the paper-generating process, Wansink gave one of his paid postdoctoral fellows the opportunity to hash the data around. The postdoc repeatedly refused; the graduate student, by contrast, never did, which is what inspired the title of the post.

Of the unpaid graduate student’s efforts, Wansink wrote:

When she arrived, I gave her a data set of a self-funded, failed study which had null results (it was a one month study in an all-you-can-eat Italian restaurant buffet where we had charged some people ½ as much as others). I said, “This cost us a lot of time and our own money to collect. There’s got to be something here we can salvage, because it’s a cool (rich & unique) data set.” I had three ideas for potential Plan B, C, & D directions (since Plan A had failed). I told her what the analyses should be and what the tables should look like. I then asked her if she wanted to do them.

Every day she came back with puzzling new results, and every day we would scratch our heads, ask “Why,” and come up with another way to reanalyze the data with yet another set of plausible hypotheses. Eventually we started discovering solutions that held up regardless of how we pressure-tested them. I outlined the first paper, and she wrote it up, and every day for a month I told her how to rewrite it and she did.

It would be surprising if there were no hypothesis driving the experiment when this data was collected. If the data had confirmed the hypothesis, a paper would certainly have already been in publication before this grad student arrived. The fact that so much work went into making any ensuing papers publishable (helped no doubt in great measure by having Wansink’s prestigious name attached) indicated how tenuous the connection was between the experimental data and anything meaningful. Yet, the “findings” of this experiment made their way into five journal articles in five different publications.
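
To see why this kind of data salvaging is so treacherous, consider a quick simulation. (This sketch is ours, not Wansink’s; the sample size, the number of outcomes, and the subgroup variable are invented purely for illustration.) Even when the buffet price has no effect on anything, testing every outcome in every subgroup will reliably turn up a few “significant” results by chance alone.

```python
# Illustrative simulation of multiple-comparisons "data salvaging."
# All numbers below are hypothetical; the price group has no real effect.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

n = 200                               # hypothetical number of diners
group = rng.integers(0, 2, n)         # 0 = half-price buffet, 1 = full price
outcomes = rng.normal(size=(n, 10))   # 10 outcome measures, all pure noise
sex = rng.integers(0, 2, n)           # one subgroup variable to slice on

subsets = {
    "everyone": np.ones(n, dtype=bool),
    "men": sex == 0,
    "women": sex == 1,
}

p_values = []
for j in range(outcomes.shape[1]):    # every outcome ...
    for mask in subsets.values():     # ... in every subgroup
        a = outcomes[mask & (group == 0), j]
        b = outcomes[mask & (group == 1), j]
        p_values.append(stats.ttest_ind(a, b).pvalue)

hits = sum(p < 0.05 for p in p_values)
print(f"{len(p_values)} comparisons run; {hits} came out 'significant' "
      "at p < 0.05 even though nothing real is going on")
```

Roughly one comparison in 20 will clear p < 0.05 on noise alone, so a determined enough re-analysis of a null data set will always produce “solutions that hold up,” at least until someone tries to replicate them.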

When Wansink published the blog post, the reaction was swift and overwhelmingly negative. The first response from a fellow academic set the tone: “Brian – Is this a tongue-in-cheek satire of the academic process or are you serious? I hope it’s the former.” Others’ comments were worse.

Wansink tried to explain his actions in terms of mentoring his students through the intricacies and difficulties of the tenure-attainment process. His responses made it apparent that he was clueless as to the scientific malfeasance inherent in such behavior.

As the outrage mounted, Wansink found himself fighting for his academic life—a fight he ultimately lost. Based on his explanations of how he had encouraged his grad student to use highly questionable methods of data analysis to secure publication, researchers started going through all his work and found multiple problems. At last count, 15 of his papers had been retracted. He is no longer involved in teaching or research at Cornell and is scheduled for early retirement this year.

Wansink’s story is far from unique, but because of the degree of fame he possessed, his story made the news. The tenure track is brutal, and publish or perish is the order of the day. It appears that Wansink was trying his best to mentor his charges through the process, and scientific rigor simply fell by the wayside. How many others are out there doing the same thing?

In 2001, Dr. Piero Anversa, a cardiac researcher who would later move his lab to Harvard Medical School, riveted the scientific community with the publication of a study that appeared to show heart muscle could be regenerated using stem cells from bone marrow. Building upon these early findings, Anversa reported that the heart produced its own stem cells, which could be removed, grown in culture, and injected back into the heart to repair damage. His work was truly astonishing: it prompted startups to develop stem-cell therapies for heart damage and generated numerous grants from the NIH for further study. The problem was, no one could replicate his work. His response was to extol his own laboratory’s expertise and tell other researchers that, basically, they just didn’t have his touch.

Ultimately, another researcher, Dr. Jeffery Molkentin, who had been trying to replicate Anversa’s experiments, developed his own method of genetically tracing the cells that had allegedly transformed into healthy heart muscle. When he checked, he found that Anversa’s results were not possible. The publication of Molkentin’s findings in 2014 spelled doom for Dr. Anversa, who subsequently had 31 of his papers retracted.

Stem-cell research is apparently fertile ground for fraudulent research. In 2014, Haruko Obokata, a stem-cell biologist at the prestigious RIKEN Center for Developmental Biology in Kobe, Japan, published two papers in the journal Nature on a simple new method for creating stem cells. Like Anversa’s papers, these quickly captured the attention of the scientific community, but within a short time, it became apparent that the experiments had been sloppily done and the data falsified. Nature retracted both papers six months after publication. Tragically, Obokata’s supervisor and the co-author of the two papers, Dr. Yoshiki Sasai, subsequently committed suicide.

These are three instances of highly regarded researchers in prestigious institutions falsifying or torturing data to make it prove something it really doesn’t. If this kind of academic misbehavior takes place under the watchful glare of top-level research facilities, how much goes unnoticed in less prestigious places?

Physicians today are inculcated with the noble admonition to practice evidence-based medicine: medicine practiced in accordance with principles grounded in good science. Science advances via publication in the scientific literature. If published research findings are tainted by sleights of hand that include outright fraud, how can physicians effectively practice evidence-based medicine?




Drs. Michael and Mary Dan Eades are the authors of 14 books in the fields of health, nutrition, and exercise, including the bestseller Protein Power.

Dr. Michael Eades was born in Springfield, Missouri, and educated in Missouri, Michigan, and California. He received his undergraduate degree in engineering from California State Polytechnic University and his medical degree from the University of Arkansas. After completing his medical and post-graduate training, he and his wife, Mary Dan, founded Medi-Stat Medical Clinics, a chain of ambulatory out-patient family care clinics in central Arkansas. Since 1986, Dr. Michael Eades has been in the full-time practice of bariatric, nutritional, and metabolic medicine.

Dr. Mary Dan Eades was born in Hot Springs, Arkansas, and received her undergraduate degree in biology and chemistry from the University of Arkansas, graduating magna cum laude. After completing her medical degree at the University of Arkansas, she entered private practice with her husband, and the two have devoted their clinical time exclusively to bariatric and nutritional medicine, gaining first-hand experience treating over 6,000 people suffering from high blood pressure, diabetes, elevated cholesterol and triglycerides, and obesity with their nutritional regimen.

Together, the Eades give numerous lectures to the general public and various lay organizations on their methods of treatment. They have both been guest nutritional experts on over 150 radio and television shows, including national segments for FOX and CBS.

Comments on The Cardinal Sins of Skewed Research, Part 5: Burning Britches

1 Comment

Matthieu Dubreucq
January 23rd, 2020 at 1:15 am

Allow a negative outcome to be the outcome. Not every study has to prove the original idea (or, worse, Plan B, C, or D).

The idea of registering what you are looking for before doing the study is also a potential avenue.
