You have to know what is real before you know what is relevant. In science we trust that published means “real”. We may debate the impacts of different conditions and ooze skepticism over significance measurements and choose, sometimes less than tactfully, to object to an author’s conclusions, but overall we trust the data are real for the experimental conditions and that repetition of the experiment under the same conditions would yield the same outcome.
However, in a Catch-22, the data on reproducibility disagree.
In a collection of articles in the open-access Nature special issue on data reproducibility, the most optimistic studies show 25 percent reproducibility of academic research. One article from Glenn Begley and Lee Ellis led to the development of Begley's Six Rules for Reproducibility, which include blinded studies, full disclosure of all results, and reagent validation. Judged against these rules, the optimistic 25 percent of reproducible studies are probably coming from far fewer than 25 percent of academic labs. I know very few researchers who can honestly read all six rules without a little shame.
On July 9th, Oxbridge Biotech Roundtable (OBR) hosted a panel discussion on this issue of data reproducibility. The panel included Dr Liz Silva, the MIND Program Manager at UCSF and the former Senior Editor at PLoS ONE; Dr Tim Gardner, founder of Riffyn; Dr William Gunn, Head of Academic Outreach at Mendeley; and Dr Corey Goodman, a partner with venBio.
Goodman opened the conversation with the first question he asks of founders pitching their science to him: "Who has reproduced this?" The rest of the discussion focused on the problems, like an increasing rate of article retraction, and potential causes, such as intense pressure to publish, funding shortages, and increased oversight. Somewhat frustratingly, very few solutions were offered. Gardner made a compelling argument for an incentivized approach to solving the problem. He suggested that two factors hold scientists back from solving the reproducibility issue: our culture and our lack of tools. We operate in "a culture of noise" in biology and accept that irreproducibility is unavoidable. However, adopting adequate documentation tools, similar to those used in other industries, could greatly diminish the issue. Through incentives, like the lower costs and time savings that come from eliminating unnecessary, faulty repetitions, we can develop and adopt new tools and change the culture. (Gardner discusses this further in the podcast referenced later.) When asked about automation as another tool in this solution, the panel felt it might play a role in the future, but that there are currently more pressing issues to address. (This could also be a reflection of the fear of the outdated bench scientist, but that is a topic for another day.)
Another potential, though more dogmatic, approach is a mandated NIH requirement for reproducibility standards. However, this reeks of additional strain on an already stretched system and more delays in publication, which would only compound the stresses that contribute to irreproducibility in the first place.
More adaptable and agile solutions are being developed through the Reproducibility Initiative. This partnership between Science Exchange, Mendeley, PLoS, and Figshare was only briefly touched upon, but it has the potential to revolutionize the way research is conducted as tools, such as those being developed by Riffyn, are incorporated and commercial replication services become cheaper and more convenient.
Overall, this is a discussion that needs to continue, and it requires total engagement from the entire scientific community. There is no clear solution, but isn't this the kind of problem we live for?
(To solicit more thoughts on solutions, and on the topic in general, from two of the panelists, Silva and Gardner, I was asked to conduct an impromptu interview for a podcast. By impromptu, I mean the interview was conducted on about five minutes' notice. I even forgot Liz's last name! Anyway, it is clear I am not ready for the evening news. As soon as the podcast is up I will add the link here, so stay tuned!)