My peer review principles & practices

Commitment to transparent, open, and credible peer review
Categories: preprints, science communication, peer review, open review, open evaluation

Published: 2025-02-03

In this entry, I outline my approach to evaluating scientific outputs based on the principles of transparency and openness.1 I also include my template responses to review invitations.

Background and principles

I signed the Peer Reviewers’ Openness (PRO) Initiative way back when I was a PhD student; as a reminder, here is its core commitment:

Openness and transparency are core values of science. As a manifestation of those values, a minimum requirement for publication of any scientific results must be the public submission of materials used in generating those results. As reviewers, it is our responsibility to ensure that publications meet certain minimum quality standards.

We therefore agree that as reviewers, starting 1 January 2017, we will not offer comprehensive review for, nor recommend the publication of, any manuscript that does not meet the following minimum requirements. Once such a manuscript has been certified by the authors to meet these minimum requirements, we will proceed with a more comprehensive review of the manuscript.

PRO Initiative; Morey et al. (2016)

More recently I’ve been entertaining the idea of signing something similar regarding open assessment: the practice of (i) evaluating openly available works and (ii) making the evaluations themselves public. For example, I find Nikolaus Kriegeskorte’s Open Evaluation proposal very agreeable:

“The current system of scientific publishing provides only journal prestige as an indication of the quality of new papers and relies on a non-transparent and noisy pre-publication peer-review process, which delays publication by many months on average. Here I propose an OE [Open Evaluation] system, in which papers are evaluated post-publication in an ongoing fashion by means of open peer review and rating. […] OA [Open Access] and OE together have the power to revolutionize scientific publishing and usher in a new culture of transparency, constructive criticism, and collaboration.”

Kriegeskorte (2012)

The scientific enterprise relies on access to accurate information. One of the ways in which scientists have tried to ensure that information is accurate is the process of peer review, where experts look at your work and evaluate whether it’s up to snuff. While the primary fruits of the peer-review process (the manuscripts) are increasingly openly available, the reviews (and editorial notes) are typically not. This creates a situation whereby consumers of the scientific literature must trust the peer-review process without being able to evaluate and learn from the evaluations themselves.

Many have suggested that this closed approach to evaluation might be suboptimal (Kriegeskorte 2012; Holcombe 2025). Moreover, the peer reviews themselves can contain information that could be widely applicable outside the specific review context. Therefore, I am taking the following steps to increase my engagement with open assessment of scientific research:

Practices

  • I review (and edit) for outlets that implement open evaluation, such as PCI: Registered Reports
  • I make my reviews publicly available (e.g. on PREreview, my blog, etc.)
  • I adhere to the PRO Initiative’s transparency and openness guidelines
  • I acknowledge that e.g. privacy reasons may require deviating from these guidelines

Template responses

Here’s some boilerplate text that I use in my responses to review invitations.

When no preprint exists

Thank you for considering me as a reviewer. I was not able to find a publicly available version of this manuscript, and so will tentatively decline your request. If you can point me to the publicly available manuscript, or if the authors make the manuscript publicly available, I would be happy to provide my signed review which I will also post publicly on PREreview (https://prereview.org/profiles/0000-0001-5052-066X) under a CC-BY 4.0 license to ensure it is permanently available and citeable.

This approach aligns with my commitment to rigorous, open, transparent, and citeable peer review of publicly available scientific work (see e.g. Kriegeskorte, 2012, “Open Evaluation: A Vision for Entirely Transparent Post-Publication Peer Review and Rating for Science”). (If a preprint already exists, I apologize for missing it and would be happy to review it if you can provide a link to it.) Please let me know if you have any questions about this process.

When a preprint exists

Thank you for considering me as a reviewer. I am happy to provide my signed review which I will also post publicly on PREreview (https://prereview.org/profiles/0000-0001-5052-066X) under a CC-BY 4.0 license to ensure it is permanently available and citeable.

This approach aligns with my commitment to open science and transparent evaluation (see e.g. Kriegeskorte, 2012, “Open Evaluation: A Vision for Entirely Transparent Post-Publication Peer Review and Rating for Science”). Please let me know if you would prefer that I not upload a public review, or if you have any questions about this process.

Open data/materials

When data/materials are not shared or transparently cited (see https://www.opennessinitiative.org/guidelines-for-action-editors-and-reviews/), I will communicate to the editor that

I believe strongly in the value of openness and transparency. Please ask the authors on my behalf whether they can certify that they have met the standards of the Peer Reviewers’ Openness Initiative (https://opennessinitiative.org/).

PRO Initiative; Morey et al. (2016)

If a resubmission doesn’t meet the basic PRO requirements, I will communicate that

I cannot recommend this paper for publication, as it does not meet the minimum quality requirements for an open scientific manuscript (see https://opennessinitiative.org/). I would be happy to review a revision of the manuscript that corrects this critical oversight.

PRO Initiative; Morey et al. (2016)

Conclusion

There is no conclusion. How we conduct, communicate, and evaluate scientific research is and always will be a work in progress. This document simply outlines my modest attempts at keeping up with (what I perceive to be) the latest gold-standard practices in transparent communication and evaluation.

Further reading

Some valuable background reading on these topics can be found in Ahmed et al. (2023); Aleksic et al. (2015); Eisen et al. (2020); Holcombe (2025); Kathawalla, Silverstein, and Syed (2021); Kriegeskorte (2012); Morey et al. (2016); Moshontz et al. (2021); Sever (2023); Silverstein et al. (2024); Syed (2024). Silverstein et al. (2024) might be especially relevant when communicating these ideas to editors.

Feature image credit: https://undraw.co/.

Feedback & comments

I’d appreciate any feedback on these ideas/practices; feel free to let me know what you think, either using the comments field below or on Bluesky.

References

Ahmed, Abubakari, Aceil Al-Khatib, Yap Boum, Humberto Debat, Alonso Gurmendi Dunkelberg, Lisa Janicke Hinchliffe, Frith Jarrad, et al. 2023. “The Future of Academic Publishing.” Nature Human Behaviour, July, 1–6. https://doi.org/10.1038/s41562-023-01637-2.
Aleksic, Jelena, Adrian Alexa, Teresa K. Attwood, Neil Chue Hong, Martin Dahlö, Robert Davey, Holger Dinkel, et al. 2015. “An Open Science Peer Review Oath.” January 9, 2015. https://doi.org/10.12688/f1000research.5686.2.
Eisen, Michael B, Anna Akhmanova, Timothy E Behrens, Diane M Harper, Detlef Weigel, and Mone Zaidi. 2020. “Implementing a "Publish, Then Review" Model of Publishing.” eLife 9 (December): e64910. https://doi.org/10.7554/eLife.64910.
Holcombe, Alex O. 2025. “Scientists! What Are You Supporting?” Alex Holcombe’s blog. January 31, 2025. https://alexholcombe.wordpress.com/2025/01/31/scientists-what-are-you-supporting/.
Kathawalla, Ummul-Kiram, Priya Silverstein, and Moin Syed. 2021. “Easing Into Open Science: A Guide for Graduate Students and Their Advisors.” Edited by Eunike Wetzel. Collabra: Psychology 7 (1): 18684. https://doi.org/10.1525/collabra.18684.
Kriegeskorte, Nikolaus. 2012. “Open Evaluation: A Vision for Entirely Transparent Post-Publication Peer Review and Rating for Science.” Frontiers in Computational Neuroscience 6. https://doi.org/10.3389/fncom.2012.00079.
Morey, Richard D., Christopher D. Chambers, Peter J. Etchells, Christine R. Harris, Rink Hoekstra, Daniël Lakens, Stephan Lewandowsky, et al. 2016. “The Peer Reviewers Openness Initiative: Incentivizing Open Research Practices Through Peer Review.” Royal Society Open Science 3 (1): 150547. https://doi.org/10.1098/rsos.150547.
Moshontz, Hannah, Grace Binion, Haley Walton, Benjamin T. Brown, and Moin Syed. 2021. “A Guide to Posting and Managing Preprints.” Advances in Methods and Practices in Psychological Science 4 (2): 25152459211019948. https://doi.org/10.1177/25152459211019948.
Sever, Richard. 2023. “Biomedical Publishing: Past Historic, Present Continuous, Future Conditional.” PLOS Biology 21 (10): e3002234. https://doi.org/10.1371/journal.pbio.3002234.
Silverstein, Priya, Colin Elman, Amanda Montoya, Barbara McGillivray, Charlotte R. Pennington, Chase H. Harrison, Crystal N. Steltenpohl, et al. 2024. “A Guide for Social Science Journal Editors on Easing into Open Science.” Research Integrity and Peer Review 9 (1): 2. https://doi.org/10.1186/s41073-023-00141-5.
Syed, Moin. 2024. “Valuing Preprints Must Be Part of Responsible Research Assessment.” Meta-Psychology 8 (March). https://doi.org/10.15626/MP.2023.3758.

Footnotes

  1. Obviously I also review manuscripts on their content, but that is not the topic of this post.↩︎

Citation

BibTeX citation:
@online{vuorre2025,
  author = {Vuorre, Matti},
  title = {My Peer Review Principles \& Practices},
  date = {2025-02-03},
  url = {https://vuorre.com/posts/open-peer-review/},
  langid = {en}
}
For attribution, please cite this work as:
Vuorre, Matti. 2025. “My Peer Review Principles & Practices.” February 3, 2025. https://vuorre.com/posts/open-peer-review/.