More than a third of the world’s top scientific journals do not publish reviews of their articles, and many more place strict limits on such reviews, according to a new in-depth survey.
The global assessment examined 330 journals – the 15 highest-ranked titles in each of 22 scientific disciplines – and found that only 207 accepted some form of post-publication comment.
And of the journals that did accept them, 67% imposed length limits on all published responses, while 32% set deadlines for submissions.
The survey, published in Royal Society Open Science, was compiled by a team of researchers from the United States, United Kingdom, Germany, the Netherlands and Australia, who describe their work as identifying a major gap in the system of peer-based scientific integrity.
“Overall,” the authors write, “post-publication criticism appears to be tightly controlled and restricted by top-ranked academic journals.”
In part, said one of the authors, Tom Hardwicke, a researcher in the psychology department at the University of Amsterdam, the policies reflect lingering attitudes from when journals focused on their print versions.
But now that online publication is the dominant mode, journals have little excuse to ban published critiques of the research they publish, or to restrict them by length or expiry date, Dr. Hardwicke said.
“It seems that academic journals, rather than facilitating a healthy culture of criticism, often implicitly or explicitly suppress it,” he told Times Higher Education.
His co-authors include John P. A. Ioannidis, a professor of medicine at Stanford University known for his 2005 PLoS Medicine article “Why Most Published Research Findings Are False”, which advanced the idea of a replication crisis in academic science.
Their new article offers several suggestions for improvement, aimed at making post-publication review a strong complement to standard pre-publication peer review, which involves a more limited set of participants.
“One idea I would particularly like to see journals try,” Dr. Hardwicke said, “is to appoint an independent post-publication editor with delegated responsibility for handling post-publication criticism, corrections and retractions.”
This kind of editorial independence seems necessary, he said, because journals have an inherent conflict of interest when dealing with reviews of articles they have already published.
From its review of 22 academic disciplines, Dr. Hardwicke’s team identified clinical medicine as having the most active culture of post-publication criticism. All 15 journals in that field accepted post-publication reviews, and together they published the most such reviews overall.
Yet the world’s top clinical medicine journals also impose the publishing industry’s strictest limits on the length of critical responses and the deadline for submitting them: a median word limit of 400 words, compared with 400 to 550 words elsewhere, and a median response window of four weeks, compared with four to six weeks in other fields.
There is no justification for time limits on a legitimate and productive response to a published journal article, Dr. Hardwicke said. And in a predominantly online environment, length limits also seem unnecessary, he argued.
“Concise writing is of course important,” he said, “but strict length restrictions can limit both the scope and the quality of criticism.”