Suffering from Slow Science Publishing

Monday, May 23, 2016

Information is instant. I remember the moment when this fact of the digital age was made unequivocally clear to me. It was the summer of 2014 in Montreal and I felt my room shake. A truck? Thunder? An earthquake? An earthquake. The news was up on Twitter faster than it seemed possible to type.

Yet, as the internet breathes immediacy into almost all forms of communication, the dissemination of biological findings has remained embarrassingly slow. Research routinely takes years to be shared, hindering the speed at which science progresses.

It took an average of six years for students graduating from UCSF between 2012 and 2014 to publish their scientific research, up from 4.7 years in the 1980s.

This means that an initial discovery from the first year or two of a graduate student’s project might not be shared with the global scientific community for another four years.

Work being done now has the potential to significantly shape future research directions, so withholding potentially valuable information impedes scientific progress.

So, how is biological research shared and why is this process so slow?

Biologists publish research articles in scientific journals, through a process that has changed little since the advent of scientific journals in the 17th century.

Two main periods contribute to the time it takes to disseminate biological findings: (1) pre-submission, the period during which the research experiments are done, and (2) post-submission, the period during which the researchers negotiate with the journal before the article is accepted for publication.

During the pre-submission stage, biological research is performed primarily by graduate students and postdoctoral fellows working in a laboratory under the direction of a faculty member.

Generally, each laboratory member works on their own research project, gathering data over the course of a few years to describe a previously unknown scientific phenomenon.

Publishing this story is essential to career progression. Prospective or current employers, funding agencies, and colleagues assess the productivity and competency of scientists by reviewing their publication records.

Scientists are rewarded for publishing in prestigious and selective journals. These journals only publish articles that they believe will make a large impact in the field, so it is necessary for scientists to produce stories these journals deem to be impactful.

In the post-submission stage, the journal decides whether the article is worthy of publication in its pages.

To do this, most journals utilize a process known as anonymous peer review, wherein journal editors ask other researchers to anonymously critique submitted papers.

Reviewers can reject the article outright or request further experiments that often take months or longer to complete.

Authors carry little clout in this interaction and are generally obliged to perform all further experiments requested by reviewers. It is not uncommon for the post-submission period — the time of submission to final publication — to be over a year.

Many blame this lengthy post-submission peer review process for slowing down publication. However, this delay may be warranted: in many cases, further experiments are necessary to ensure the quality of the results presented and conclusions drawn.

For example, reviewers might ask authors to perform additional control experiments to rule out alternative explanations of the data.

Instead, a less obvious and more pervasive consequence of the current publishing system significantly delays the sharing of findings: papers contain more data than ever before.

As more scientists vie to publish their work in top-tier journals, these journals have responded by publishing only the most complex and mature stories, those that explain a larger piece of the scientific unknown.

A more harmful side of the peer review process has contributed to this trend. Rather than rejecting papers based on sloppy science, reviewers criticize papers for having an insufficiently developed story and request experiments to expand the project past its original stopping point.

With no way to publish small pieces of data, reviewers must ask for all relevant data concerning a project to be included in the original publication or those data may never come out. In the short term, these experiments significantly increase the length of the post-submission period.

Over time, the standard size for a publication increases. This, in turn, increases the length of the pre-submission period as scientists are forced to include more data in their initial submission or risk being outright rejected by journals.

This system of publishing has significant consequences for research and research sharing.

First, the original findings must wait for the maturation of the entire story before they can be shared with the world.

Second, researchers are disincentivized from working on projects that might yield only small stories, or stories to which they can contribute only minimally because of limits on time, resources, or expertise, despite the potential value of that work.

Third, these small stories that do exist are withheld from the scientific record, wasting time and resources of future groups who would benefit from that knowledge.

Accordingly, the published record of biological discovery is often years behind actual discovery and is missing work that was deemed too small to publish. This system is obviously broken.

Surely, reading “complete” stories is convenient. However, no story is ever really complete. Science is iterative. Articles build upon articles.

The current standard of “completion” for publication is arbitrary; there is no reason it cannot change. We just have to want to change it.

In considering change, it is important to ask why we publish scientific data. Eclipsed by the imperatives of selling journal subscriptions and advancing careers, there exist humble yet earnest motives for the work that biologists do.

We work to build a scientific understanding of the biology that defines and surrounds us and to share that knowledge with others.

Often, this work leads to solutions for pressing problems society faces, such as disease, food insecurity, or environmental degradation.

No project will solve these issues alone. Rather, scientists learn from each other’s work and build upon it, iteratively moving us towards resolutions. Publishing all data in a timely manner will accelerate this process.

By paying more attention to prestige than to the speed of disseminating results, we are failing.

But a change might be coming, at least in some fields of biological research. The field of epidemic research has acknowledged the crucial role that rapidly accessible data plays in preventing the spread of disease and in treating those already affected, and it has pushed to change its publishing practices.

Recent papers on the Zika virus currently devastating South America are brief, containing only small amounts of data, and are processed quickly once submitted to a journal.

The impetus for this type of publishing in active epidemics came from a consortium of funders and journals, led by the World Health Organization in response to the Zika virus, that has called for the rapid dissemination of all sound research findings related to public health emergencies.

I can’t figure out why this model shouldn’t apply to all scientific data. Is cancer less of an emergency?

In the eyes of journals, reviewers, and scientists, it seems so. The most recent articles published on cancer are full-length stories that likely took years from the original finding to be “ready” to submit and then spent months in the peer review process. In fact, all research justifies a sense of urgency.

Science is interdisciplinary. Basic research informs and enables research on diseases or climate change.

Scientists, reviewers, and journals have a responsibility to other scientists, funders, patients, and society to share scientifically sound research — all of it — quickly.

Where do we go from here?

A new group of journals, ScienceMatters, is confronting this issue head-on. It is founded on the principle of publishing individual observations: it provides a medium for sharing small amounts of stand-alone data that have no outlet in the current system, or for publishing multi-year projects in smaller pieces, significantly decreasing pre-submission time.

In addition, ScienceMatters aims to decrease the time from submission to publication by focusing on the scientific integrity of the work and eliminating the emphasis on the greater story, which should limit the scope of reviewer requests.

Judging by the observations already published in ScienceMatters, this seems to have worked: the average submission-to-publication time is about one month, with times ranging from one day to four months.

But what about stories? Stories provide a framework through which we can interpret a collection of findings and draw conclusions about biological phenomena. ScienceMatters recognizes this need for stories and is providing a platform for publishing narrative syntheses of single observations it has already published.

While the efforts of ScienceMatters are exciting and commendable, drastically altering the entrenched status quo of how information is shared in biology will not be easy, particularly when the enterprise of biological research is set up to reward publications in prestigious journals.

This reality means that prestigious journals have an important role to play in accelerating publication. Journals, reviewers, and authors must collectively acknowledge the incongruity of the current publication system with the greater good and work together to decrease the amount of data required for publication.

While large economic disincentives for early adopters make the greater good a difficult motivator, a critical mass can shift publishing practices to create a system that better serves scientific advancement and societal betterment.