Flexible New Research Framework Can Improve Reproducibility, Transparency


Woohoo: you’ve finally submitted the paper you’ve worked on for years to your favorite peer-reviewed journal. Three months later, you receive a rejection letter—not for the content, but because you failed to provide required supporting materials. But wait, you didn’t need to provide those to the last journal you submitted work to, or the one before that… so why does this one need them?

Since a standard publishing framework does not exist, different journals have different guidelines. Guidelines do exist, but they are often specialist standards, such as ARRIVE, which covers animal research, and CONSORT, which governs clinical trial reporting. The result is a fragmented scientific publishing landscape that increases the burden on authors and editors alike.

Working with six renowned journals, a team from the University of Edinburgh and the Centre for Open Science has developed a new framework they say will harmonize the recording of outcomes and improve reproducibility, replication and transparency throughout the life sciences. The new Research Materials, Design, Analysis and Reporting (MDAR) Framework is outlined in a new publication in PNAS.

"Improving research is challenging; it requires ongoing effort, adapting to the changing demands and circumstances of the time,” said Malcolm Macleod, study author and academic lead for research improvement and research integrity at the University of Edinburgh. “No single intervention will be sufficient, but we hope the MDAR framework can contribute to the range of initiatives [that] support improvement."

The heart of the framework is its flexibility and ease of adoption. As it is structured now, MDAR includes three separate outputs. The framework itself sets out minimum requirements and best practice recommendations across the four core areas that comprise its name. The optional checklist helps operationalize the framework by serving as an implementation tool, aiding authors in complying with journal policies, and helping editors and reviewers assess reporting and adherence to the standard. Finally, the elaboration document is a user guide that provides context for the framework.

“In many ways, the MDAR Framework is a joint iteration on our collective previous experience with guidelines, checklists and editorial requirements toward a harmonized and practical guideline for minimum reporting requirements,” the researchers explain in their paper.

To gain valuable feedback on the MDAR Checklist and ensure its practicality, the team piloted the framework across 13 journals and 289 manuscripts. According to the paper, 80% of the 211 corresponding authors who responded to the team’s survey found the checklist “very” or “somewhat” helpful. Likewise, 84% of the 33 editorial staff members who participated in the pilot also found the checklist “very” or “somewhat” helpful.

A significant takeaway from the pilot program was that only 15 of the 42 checklist items were considered relevant to more than half of the 289 manuscripts. While study participants suggested organizing the checklist in a nested way to allow users to easily skip non-relevant sections, the authors took a different approach, given their focus on flexibility. The team decided the “best organization may be journal-specific,” leaving that as an implementation decision for journals that elect to use the checklist.

Again in the name of flexibility, the research team settled on three increasingly stringent levels of implementation for journals: recommended, limited mandate and full mandate.

While the ultimate goal is transparent journal reporting to improve research, the team envisions the MDAR Framework being used by other stakeholders in the life science community. They hypothesize it could be used by researchers to design, conduct, analyze and report studies; by institutions to teach best research practices; and by funders and others involved in research assessment to help evaluate rigor and reporting.

Lastly, the framework has the potential to be especially impactful in the ever-growing world of scientific preprints.

“With the growth in preprints in recent years, preprint servers are uniquely placed to collect (and post) an author-completed MDAR checklist, which could then travel with the manuscript when submitted to a journal,” the authors write.

The full set of MDAR resources is available in a collection on the Open Science Framework. The authors say it will be maintained and updated as a community resource, further evolving as time goes on and implementation rates rise.

“We have identified a basic minimum requirement as well as an aspirational set of best practices that provide directional guidance for the future evolution of the requirements. With time, we expect that elements currently identified as ‘best practice’ will instead be identified as a ‘basic minimum requirement’ as reporting performance improves,” the authors conclude.