If biology has an Indiana Jones, it is Christopher Ramsden: he specializes in excavating lost studies, particularly those with the potential to challenge mainstream, government-sanctioned health advice.

It is well known that a strong publication bias skews our knowledge and interpretation of research. Scientific journals (and general newspapers) have a strong propensity to publish research with striking and unexpected claims, and scientists have a corresponding propensity to submit only studies that make such claims.
His latest excavation — made possible by the pack-rat habits of a deceased scientist, the help of the scientist’s sons, and computer technicians who turned punch cards and magnetic tape into formats readable by today’s computers — undercuts a pillar of nutrition science.
Ramsden, of the National Institutes of Health, unearthed raw data from a 40-year-old study, which challenges the dogma that eating vegetable fats instead of animal fats is good for the heart. The study, the largest gold-standard experiment testing that idea, found the opposite, Ramsden and his colleagues reported on Tuesday in BMJ (formerly the British Medical Journal).
Although the study is more than just another entry in the long-running nutrition wars — it is more rigorous than the vast majority of research on the topic — Ramsden makes no claims that it settles the question. Instead, he said, his discovery and analysis of long-lost data underline how the failure to publish the results of clinical trials can undermine truth.
Absent a time machine, it’s impossible to know how publication of the study, conducted in Minnesota from 1968 to 1973, might have influenced dietary advice. But in an accompanying editorial, Lennert Veerman of Australia’s University of Queensland concluded that “the benefits of choosing polyunsaturated fat over saturated fat seem a little less certain than we thought.”
If one study finds a strong effect (regardless of the issue), it is likely to be published. If nine other scientists run similar studies and find no effect, their work tends to go into the file cabinet, which is essentially what happened in the case reported above. The consequence is that the field sees one positive study and no contradicting studies, and then incorrectly concludes that the effect is real.
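The mechanism described above can be made concrete with a small simulation. The sketch below (all names and parameters are illustrative, not from the source) simulates many studies of an effect whose true size is zero, then "publishes" only the studies that reach statistical significance. The published record both shows a spurious effect and wildly overstates its size:

```python
import random
import statistics

def run_studies(true_effect=0.0, n_per_study=50, n_studies=1000, seed=42):
    """Simulate many small studies estimating the same effect.

    Each study's estimate is drawn around the true effect with standard
    error sigma/sqrt(n) (sigma = 1 here). A study is 'published' only if
    its estimate is statistically significant (|z| > 1.96), mimicking
    the file-cabinet effect for null results.
    """
    rng = random.Random(seed)
    se = 1.0 / n_per_study ** 0.5
    estimates = [rng.gauss(true_effect, se) for _ in range(n_studies)]
    published = [e for e in estimates if abs(e) / se > 1.96]
    return estimates, published

estimates, published = run_studies()
print(f"all studies:       mean |effect| = {statistics.mean(abs(e) for e in estimates):.3f}")
print(f"published studies: mean |effect| = {statistics.mean(abs(e) for e in published):.3f}")
print(f"published {len(published)} of {len(estimates)} studies")
```

Even though the true effect is exactly zero, roughly 5% of studies come out significant by chance, and those are the only ones a reader of the literature would ever see.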
Of course, human desires and expectations come into this. It takes courage to buck received wisdom. Institutions that grant funding are notoriously reluctant to go out on a limb, and those conducting an investigation are rarely agnostic about its outcome. There are plenty of good reasons why science doesn't proceed as quickly and as effectively as it might, particularly when it comes to fundamental questions about complex human systems in which many stakeholders have vested interests.
Still, this is an interesting example of all those issues in play.