Centre of Expertise Urban Vitality

Is the future of peer review automated?


The rising rate of preprints and publications, combined with persistently inadequate reporting practices and problems with study design and execution, has strained the traditional peer review system. Automated screening tools could enhance peer review by helping authors, journal editors, and reviewers to identify beneficial practices and common problems in preprints or submitted manuscripts. Tools can screen many papers quickly, and may be particularly helpful in assessing compliance with journal policies and with straightforward items in reporting guidelines. However, existing tools cannot understand or interpret a paper in the context of the scientific literature. Tools cannot yet determine whether the methods used are suitable to answer the research question, or whether the data support the authors’ conclusions. Editors and peer reviewers remain essential for assessing journal fit and the overall quality of a paper, including the experimental design, the soundness of the study’s conclusions, and its potential impact and innovation. Automated screening tools cannot replace peer review, but they may aid authors, reviewers, and editors in improving scientific papers. Strategies for responsible use of automated tools in peer review may include setting performance criteria for tools, transparently reporting tool performance and use, and training users to interpret reports.

Reference: Schulz, R., Barnett, A., Bernard, R., Brown, N. J. L., Byrne, J. A., Eckmann, P., Gazda, M. A., Kilicoglu, H., Prager, E. M., Salholz-Hillel, M., Ter Riet, G., Vines, T., Vorland, C. J., Zhuang, H., Bandrowski, A., & Weissgerber, T. L. (2022). Is the future of peer review automated? BMC Research Notes, 15, Article 203. https://doi.org/10.1186/s13104-022-06080-6
Published by Urban Vitality, 1 January 2022


Robert Schulz
Adrian Barnett
René Bernard
Nicholas J. L. Brown
Jennifer A. Byrne
Peter Eckmann
Małgorzata A. Gazda
Halil Kilicoglu
Eric M. Prager
Maia Salholz-Hillel
Timothy Vines
Colby J. Vorland
Han Zhuang
Anita Bandrowski
Tracey L. Weissgerber
