Scientists publish their findings. Others then use that information to develop and test new ideas, and society accrues knowledge incrementally through this process. Some obstacles on the path from results to publication are necessary. In the current system, however, certain obstacles are slowing the overall flow of new science while simultaneously letting poor science through.
Peers must first evaluate the rigor of a study before it can be released into the scholarly literature. In their recent editorial, “Indexing the indices: scientific publishing needs to undergo a revolution”, Delzon, Cochard, and Pfautsch argue that the peer-review process has lost its ability to effectively and efficiently green-light additions to the primary literature. Delzon et al. assert that this is a consequence of journals striving to raise their status (i.e., their rankings against other journals, typically via the impact factor). The way journal impact is measured needs a serious overhaul, and Delzon et al. think Google Scholar’s H5 index (a journal-level equivalent of the Hirsch index) is just the tool for the job.
Instead of ranking a journal by the average number of citations its publications received over the past five years (the traditional IF5 metric), the H5 index ranks a journal only by its top-cited publications: it is the largest number h such that h of the journal’s articles from the past five years have each been cited at least h times. Papers that are rarely cited (or not cited at all) don’t affect the H5 score either way. A switch to the H5 index doesn’t seem to change the current ranking of top journals (at least in plant science and chemistry, but see this other analysis). The H5 strategy is advantageous because it removes the pressure on editors to reject papers they perceive to have little citation potential. If journals are more likely to accept papers (over 75% are currently rejected by top journals), authors are spared the hassle of re-submitting multiple times, each time seeking an outlet with progressively lower impact. New findings can then reach the scientific community (and perhaps the public, if the journal is open access) quickly enough to advance science.
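To make the definition concrete, here is a minimal sketch of how an h-type index like H5 could be computed from a journal’s per-article citation counts. The citation numbers below are entirely hypothetical, chosen only to illustrate the calculation.

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # this paper still clears the h threshold
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one journal's articles over five years
counts = [50, 18, 9, 6, 4, 4, 1, 0, 0]
print(h_index(counts))  # -> 4: four papers have at least 4 citations each
```

Note that appending more uncited papers to `counts` leaves the result unchanged, which is exactly the property the editorial highlights: low-citation papers neither help nor hurt the journal’s score.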
Most importantly, Delzon et al. highlight, a switch to the H5 index would also lessen the burden on reviewers. In the current system, high rejection rates mean that the same paper is reviewed repeatedly. Reviewers are called into action more often than is necessary, or ultimately sustainable, given that peer review is essentially a volunteer service to the scientific community. Over-taxed expert reviewers must decline more reviews, which forces journals to reach out to non-expert or inexperienced reviewers. Unsound scientific findings, not properly vetted, then enter the scientific literature, an unfortunate result that undermines the basic tenet of the peer-review process. So, yes, it seems we are in need of a revolution in scientific publishing!
Further reading on journal impact and peer review: