Systems thinking to better understand the scientific process
Linkages between review, dissemination and assessment
As an early-career scientist, I am glad to have come across a platform like OpenUpHub, which can guide us through the rough waters of the Devil’s Triangle of review, dissemination and assessment. When I think about these three issues, as I inevitably do every day, I remember my very first publication in a peer-reviewed journal, for which I went through three rounds of harsh review, identity crises and agonizing revisions. When it was finally accepted a year after the initial submission, I felt accomplished and proud to publish in the most prestigious journal of my field, too naïve to grasp what an impact factor below 1 means, even for a well-regarded journal. I did not have a Twitter, ResearchGate or Mendeley account back then to post the publication, and I did not even bother to circulate it by email.
I later came to understand that the harsh reviews of this manuscript would be among the best I have ever received: the reviewers showed a very good understanding of the subject, apparently spent a great deal of their valuable time on careful evaluation, and offered concrete solutions to strengthen the manuscript. Therefore, when I wrote “We are thankful for the fruitful comments of the reviewers and we believe that the manuscript has significantly benefited from their review”, a sentence I have copied and pasted into every response letter I have written since, I really meant it. Nevertheless, the paper has received very few citations so far, pushing it toward the dark and forgotten corners of the academic literature.
Assuming that the citation count of a paper is a good indicator of its assessment by the scientific community, many studies give useful hints about how to increase it. For instance, the more revisions a manuscript goes through, the more it is cited [1]. The shorter the title of a paper, the higher its citation count [2]. International collaborations bring more citations [3]. Many studies remain uncited for the first few years after publication, yet receive attention later on [4,5]. Such studies are based on correlations and certainly do not imply causation. However, they highlight the variety of factors that affect the assessment of scientific output, from the review process to natural delays in the system, and how interconnected the preparation, presentation, dissemination and impact of research outputs are.
To deal with the complexity this interconnectedness creates, the most urgent change I would like to see in how research is reviewed, disseminated and assessed is a switch from addressing each of these issues individually to systems thinking. Several suggestions have been made to improve the review process [6], and several tools have been developed, as listed by OpenUpHub. Similarly, several alternative assessment criteria have been proposed that go beyond citation scores and altmetrics. However, we do not yet have a complete picture of how these different aspects of the scientific process relate to each other, or of how other pressing issues, from gender equality to job insecurity and mental health, influence it.
In my opinion, OpenUpHub is a great initiative to serve as a platform where all these aspects and issues are brought together. Its compilation of tools and resources provides a very valuable reference for scientists, publishers, librarians and policy makers. A further step I would like to see is a systems map that depicts the interlinkages between the review, dissemination and assessment processes, as well as the other pressing issues. Such a map, describing the relationships, underlying mechanisms and feedback loops, could help all actors involved, from scientists to policy makers, better understand the multidimensional system of scientific research.
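To make the systems-map idea a bit more concrete, here is a minimal sketch in Python. Everything in it is my own illustrative assumption rather than an established model: the map is encoded as a directed graph whose edges read “A influences B”, and a small depth-first search lists the feedback loops such a map would expose.

```python
# A toy systems map: every node and edge is an illustrative assumption,
# not an empirical model. An edge A -> B reads "A influences B"; any
# cycle in the graph is a feedback loop.
systems_map = {
    "review":        ["dissemination", "mental health"],
    "dissemination": ["assessment"],
    "assessment":    ["review", "job security"],
    "job security":  ["review", "mental health"],
    "mental health": ["review"],
}

def feedback_loops(graph):
    """Collect simple cycles (feedback loops) via depth-first search,
    de-duplicated by rotating each cycle to a canonical starting node."""
    found = set()

    def canonical(core):
        i = core.index(min(core))      # rotate so the smallest label leads
        return tuple(core[i:] + core[:i])

    def dfs(node, path):
        for nxt in graph.get(node, []):
            if nxt in path:            # the walk closed back on itself: a loop
                found.add(canonical(path[path.index(nxt):]))
            else:
                dfs(nxt, path + [nxt])

    for start in graph:
        dfs(start, [start])
    return found

for loop in sorted(feedback_loops(systems_map)):
    print(" -> ".join(loop + (loop[0],)))
```

Even on this toy map, the search surfaces loops such as review -> dissemination -> assessment -> review, which is exactly the kind of structure a real systems map could make visible to all the actors involved.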
Yes, I am also trying to apply systems thinking, on an individual basis, to my own publication strategy, to keep my very first paper out of the dark and forgotten corners of academia where it might be heading.
References
1. Rigby J, Cox D, Julian K. Journal peer review: a bar or bridge? An analysis of a paper’s revision history and turnaround time, and the effect on citation. Scientometrics 2018, 114(3): 1087-1105.
2. Deng B. Papers with shorter titles get more citations. Nature News, 26 August 2015.
3. Adams J, Gurney KA. Bilateral and multilateral coauthorship and citation impact: patterns in UK and US international collaboration. Frontiers in Research Metrics and Analytics 2018, 3: 12.
4. Van Noorden R. The science that’s never been cited. Nature 2017, 552: 162-164.
5. Cressey D. ‘Sleeping beauty’ papers slumber for decades. Nature 2015, 521: 394.
6. Wagenknecht T. Unhelpful, caustic and slow: the academic community should rethink the way publications are reviewed. LSE Impact Blog, 2018.