
Openly Streamlining Peer Review

We are delighted to host our first guest post on Biologue by James Rosindell and William D. Pearse from Silwood Park, Imperial College London. They share their view on how we might improve peer review.

Elsevier recently announced a peer review challenge. Open to all, the competition invites contestants to submit a 600-word idea of how to improve the peer review system. Winning entries are soon to be announced on Elsevier’s website. Unfortunately, the terms and conditions of entry make your idea the property of Elsevier, so we fear that even those who don’t win will no longer have the right to implement their idea or make it public. Since having our idea known is more important to us than winning a prize, we decided to make our entry publicly available here rather than enter the challenge. In this way, we hope that our ideas will contribute to the debate on this subject and to the implementation of some of our suggestions.

James Rosindell (left) and Will Pearse

Computing and the Internet have revolutionised access to scientific work over the last 20 years, while the peer review system has remained relatively unchanged; in essence, it is optimised for a pre-Internet world, despite incorporating obvious features such as online submission. It would be remarkable if the existing peer review system remained the optimal one for the 21st century, now that swift, international, free, paperless exchange of information makes new ideas easier to disseminate, without barriers such as the cost of printing and distribution. There may be resistance to moving away from a “local optimum” – a state where any small change appears to be a step in the wrong direction, even though the right larger change could yield huge improvements. There may also be questions over what the new system should be, but we are glad to see that Elsevier acknowledges the need to consider alternatives through their challenge. We do not pretend that our idea is a complete solution, but we hope to contribute to the necessary discussion over what peer review should be. We believe that an effective and fair reviewing system is needed to support a rapidly growing scientific community and body of knowledge.

Our Entry

Peer review is an essential part of science, but there are problems with the current system. Despite considerable effort on the part of reviewers and editors, it remains difficult to obtain high-quality, thoughtful and unbiased reviews, and reviewers are not sufficiently rewarded for their efforts. The process also takes a lot of time for editors, reviewers and authors.

We believe that these problems are inter-related. A fairly rewarded reviewer should do a better job, and thus make the editors’ task easier. The publishing process is already expensive, and scientists value their reputations, so rewarding reviewers with scientific status rather than with money seems the natural solution to us. One way would be to make reviews signed, published online, and citable. This in turn would make the process less frustrating for authors: a scientific discussion is useful and exciting, but a debate with an anonymous opponent who has nothing to gain or lose apart from time is not.

We propose a new system for peer review. Submitted manuscripts would be made immediately available online.

Cartoon by Nick D Kim, strange-matter.net (please see site for terms of reuse)

Commissioned and/or voluntary reviews would appear online shortly afterwards. The agreement or disagreement of other interested scientists and reviewers would be automatically tallied, so editors would have a survey of general opinion, as well as full reviews, to inform their decisions. Far from being unrealistic, a similar system already exists on the web: Reddit, which ranks webpages based on public opinion and has nearly 100 million views per day (alexa.com). Reddit incorporates useful features such as karma, which indicates the overall reception of an individual’s posts.

In our proposed system, users would log into the system and have the opportunity to vote once for each article (or reviewer’s comment), thereby moving it up or down the rankings. Access could be restricted to those within the academic world, or even within an appropriate discipline, so that only appropriately qualified individuals could influence the rankings. The publication models of established journals would be preserved, as full publication of an article could still take place once the journal is satisfied with the scientific community’s reception of the work. Specialists would have immediate and free access to the cutting edge of science, while the wider community would still benefit from the filtering function of peer review. User biases regarding author identity or subject area could be automatically detected, and users could then be given the chance to defend their views openly. Conflicts of interest could be declared and taken into account when calculating rankings. Complete openness would mean no one need fear retribution for stating their scientific position. Unfair treatment would be clear for all to see, not hidden behind the walls of anonymity created by the current system. If a journal rejected a paper, it would remain online along with its reviews and its ranking. Authors could submit a revision to a different journal along with a link to the earlier version, so reviewers’ comments and users’ opinions are retained for further use, significantly increasing the system’s efficiency.
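To make the mechanics concrete, here is a minimal sketch in Python of how such a tally might work. The class names, the net-score ranking, and the way conflicts of interest are handled below are our own illustrative assumptions, not a specification of any existing system.

```python
from dataclasses import dataclass, field


@dataclass
class Item:
    """A posted manuscript or a signed review in this hypothetical system."""
    item_id: str
    up: int = 0
    down: int = 0
    voters: set = field(default_factory=set)  # enforces one vote per user

    @property
    def score(self) -> int:
        return self.up - self.down


class ReviewBoard:
    """Toy vote tally for qualified users; the ranking is simply net score."""

    def __init__(self, qualified_users, conflicts=None):
        self.qualified_users = set(qualified_users)  # e.g. verified academics in the discipline
        self.conflicts = conflicts or {}             # user_id -> item_ids with a declared conflict
        self.items = {}

    def submit(self, item):
        self.items[item.item_id] = item

    def vote(self, user_id, item_id, up=True):
        item = self.items[item_id]
        if user_id not in self.qualified_users:
            return False   # only qualified users may influence the rankings
        if user_id in item.voters:
            return False   # one vote per user per article or comment
        if item_id in self.conflicts.get(user_id, set()):
            return False   # declared conflict of interest: vote is not counted
        item.voters.add(user_id)
        if up:
            item.up += 1
        else:
            item.down += 1
        return True

    def ranking(self):
        """The editors' survey of general opinion: items sorted by net score."""
        return sorted(self.items.values(), key=lambda i: i.score, reverse=True)


# Example: two qualified users vote; a third vote is discarded for a declared conflict.
board = ReviewBoard(qualified_users={"alice", "bob", "carol"},
                    conflicts={"carol": {"ms-001"}})
board.submit(Item("ms-001"))
board.submit(Item("ms-002"))
board.vote("alice", "ms-001", up=True)
board.vote("bob", "ms-001", up=True)
board.vote("carol", "ms-001", up=True)   # ignored: carol declared a conflict on ms-001
board.vote("alice", "ms-002", up=False)
print([(i.item_id, i.score) for i in board.ranking()])  # [('ms-001', 2), ('ms-002', -1)]
```

A real implementation would of course need authentication, bias detection and a more robust ranking than a raw net score, but the point is that none of the bookkeeping above is technically difficult.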

Ultimately, it may be possible for journals to approach authors for their unpublished manuscripts based on the online reviews and rankings. Indeed, since it is possible to cite articles in online repositories, journals agreeing to publish these works could immediately claim all pre-existing citations to contribute to their impact factor. A recent simulation suggests that a system where editors bid for articles increases everyone’s number of publications and thus speeds the advancement of science (Allesina, 2009). Many of these changes could be introduced incrementally. Publishing reviews and using the existing (open-source) Reddit code to rank scientific work would be straightforward, and could yield significant benefits to everyone involved in publishing science.
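To illustrate why letting journals see, and bid for, posted manuscripts in parallel can save time, here is a toy calculation in Python. It is not Allesina’s model: the selectivity thresholds, the three-month review round and the uniform quality distribution are assumptions made purely for illustration.

```python
import random

random.seed(1)

# Hypothetical numbers for illustration only: a ladder of journal selectivity
# thresholds (strictest first) and an assumed three months per review round.
JOURNAL_THRESHOLDS = [0.9, 0.7, 0.5, 0.3]
ROUND_MONTHS = 3

def sequential_submission(quality):
    """Author works down the ladder; every rejection costs a full review round."""
    months = 0
    for threshold in JOURNAL_THRESHOLDS:
        months += ROUND_MONTHS
        if quality >= threshold:
            return months
    return months  # never accepted; the time was still spent

def editor_bidding(quality):
    """All journals see the posted manuscript and its reviews at once, so any
    journal whose threshold the paper meets can claim it after one round."""
    if any(quality >= t for t in JOURNAL_THRESHOLDS):
        return ROUND_MONTHS
    return ROUND_MONTHS * len(JOURNAL_THRESHOLDS)

papers = [random.random() for _ in range(10_000)]  # manuscript "quality" drawn uniformly
seq = sum(sequential_submission(q) for q in papers) / len(papers)
bid = sum(editor_bidding(q) for q in papers) / len(papers)
print(f"mean months to a decision: sequential={seq:.1f}, bidding={bid:.1f}")
```

Even in this crude setting, removing the sequential resubmission queue cuts the average time to a decision substantially.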

Reference: Allesina, S. (2009) arXiv:0911.0344v1

This post was originally submitted to PLOS Biology before the submission deadline for Elsevier’s competition. It has not materially changed since that date.

The authors declare they have no competing interests.
