The Reproducibility Project

Alex S. Woodell

As members of the biomedical research community, we are all familiar with the basic structure of the scientific method. We make an observation that identifies a problem, form a hypothesis based on known information, perform experiments to test the hypothesis, and then draw conclusions that drive future research. However, there is a fifth component to this scheme that is often overlooked: replication. Replication allows us to be confident that our findings are legitimate and provides a mechanism for scientists to hold each other accountable for the work they produce. This concept is critical for any scientist under the biomedical sciences umbrella, as many of these studies are geared directly toward improving patient outcomes and carry significant consequences.

In 2011, Bayer made headlines after publishing a paper in Nature Reviews Drug Discovery reporting that only 25% of preclinical studies it examined could be replicated. This was followed by a 2012 Nature paper from Amgen, whose scientists could reproduce the findings of only 11% of the projects they analyzed. Together, these reports rallied the biomedical community against the reproducibility crisis. However, neither report disclosed which studies were scrutinized or provided raw data from the repeat experiments. In an effort to address the scope and causes of these findings, the Reproducibility Project: Cancer Biology (RPCB) was born.

The RPCB is a collaborative initiative between the Center for Open Science, a non-profit organization that promotes transparency, integrity, and reproducibility in scientific research, and Science Exchange, an online marketplace that allows scientists to outsource research services. After its inception in 2013, the program set out to accomplish two primary aims: (1) assess reproducibility in preclinical cancer research and (2) identify the primary factors that affect reproducibility. The team chose 50 cancer biology papers from high-profile journals, including Nature and Science, and matched them with independent labs to perform replication studies that would later be published in eLife.

The team originally budgeted $26,000 per experiment, but this proved to be a gross underestimate. The costs of time-consuming peer review, materials-transfer agreements, and animal experiments drove the figure up to an estimated $40,000 or more. As a result, in late 2015 the team stopped pursuing 10 studies involving animal experiments and dropped an additional three because of minimal communication with the original authors. A second scale-back was announced in January 2017, bringing the total number of papers down to 29.

Replication results for 10 of the cancer biology studies have now been published in eLife. Six were mostly repeatable, two were inconclusive, and two were negative. Some argue that the contract labs that conducted the studies are at fault for the "negative" findings. One replication study, investigating a 2010 Science report that a peptide called iRGD helps chemotherapy drugs shrink prostate tumors in mice, failed to reproduce the results entirely. However, labs in Germany, China, and Japan have successfully replicated the original findings. Notably, the contract lab used commercially sourced iRGD and never validated that the peptide was active before testing it against tumors. Such oversights call into question the validity of the repeat experiments themselves.

Two of the inconclusive studies involved growing implanted tumor tissue in mice. Tumor growth can be affected by a number of factors, including mouse age, the number of cells implanted, and even surgical technique. Because of unforeseen differences in these variables, such complex experiments commonly require repeated iterations, even when the protocol is well established. Unfortunately, per eLife policy, the RPCB does not afford contract labs the ability to troubleshoot experiments. The team is currently finishing the write-ups for 8 of the remaining 19 studies. Because the project's grant funding expired at the end of 2017, the final 11 studies will not be completed in their entirety; their incomplete results will be published later this year.

So how do we fix an issue so deeply rooted in academic research? The RPCB highlights several key areas for improvement:

  1. Devil’s in the details – protocols should include as much information as possible about how an experiment was carried out; even seemingly irrelevant details can be key to replicating a study.
  2. Sharing is caring – original materials such as plasmids or proprietary cell lines should be deposited in life science repositories to reduce the amount of time required to generate these materials from scratch.
  3. Sound as a bell – biochemical assays and translational models should be independently validated within each lab to reduce random noise and produce sound results.

The solution to the reproducibility problem is not simple and will likely require concerted effort from the scientific community at large. At the same time, initiatives such as the RPCB need to streamline the replication process to prevent cutbacks, while ensuring the validity of repeat studies by allowing contract labs to fine-tune experimental techniques and parameters as needed.

Do you have ideas for how reproducibility can be enhanced in the biomedical sciences or standardized across academic research as a whole? Please let us know in the comments below.

References

Mullard, A. (2017). Cancer reproducibility project yields first results. Nat Rev Drug Discov, 16(2), 77.

Kaiser, J. (2017). Rigorous replication effort succeeds for just two of five cancer papers. Science.

The Reproducibility Project: Cancer Biology

Maher, B. (2015). Cancer reproducibility project scales back ambitions. Nature.

Photo Credits

Reality vs. appearances: Wood vs. lack of light, Ralf Steinberger

 
