Over the past few years, attention to Open Science has increased. In academia, many researchers and universities embrace the idea of opening the black box of academic practices, becoming more collaborative, transparent, and reflective in their work [1,2]. Since the 2010s, these visions have been actively shared and promoted in policy plans and government strategies. Framed as a transition from ‘Science 1.0’ to ‘Science 2.0’ [3,4], Open Science has grown to become
“an inclusive construct that combines various movements and practices aiming to make multilingual scientific knowledge openly available, accessible and reusable for everyone, to increase scientific collaborations and sharing of information for the benefits of science and society, and to open the processes of scientific knowledge creation, evaluation and communication to societal actors beyond the traditional scientific community” [5].
Despite the broadly shared enthusiasm for Open Science, its principles might not be the first and foremost thing academics consider when publishing their work. Instead, the publication strategy of the academic’s institute or research group often takes priority. These mainly internal documents are meant to guide researchers within a research group through the publication process by advising which journals to publish in. When these strategies are set up, publication strategists weigh different aspects, such as the visibility of the research group in particular academic fields and the general scientific impact of publications [6]. Research metrics are usually prioritized over Open Science principles.
Research metrics
Many publication strategies are based on research metrics. Perceived as a measure of the quality of a scientific journal, research metrics express the average number of citations per article published in that journal. Some research metrics try to normalize these numbers per research field. Whichever research metric is chosen, the metric is often readily available: many journals proudly present their Journal Impact Factor, Eigenfactor score or Article Influence Score on their website, or these can be found in the scores’ databases [7,8]. Following these scores makes setting up a publication strategy relatively straightforward: only journals with a top 25 percent (or sometimes top 10 percent) score in the targeted scientific field are included. The publication strategists’ decisions are limited to the choice of scientific field and research metric, after which the publication strategy rests on a quantitative measure alone.
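To make the arithmetic concrete, here is a minimal sketch in Python of how such a metric-driven strategy reduces to a sort and a threshold. All journal names and figures are invented for illustration (not real data), and the score is a simplified impact-factor-style average of citations per article, combined with the top-25-percent cut-off described above:

```python
# Toy illustration of an impact-factor-style score and a quartile cut-off.
# All journal names and numbers are invented for illustration only.

journals = {
    # journal: (citations received, citable items published)
    "Journal A": (1200, 300),
    "Journal B": (450, 250),
    "Journal C": (90, 120),
    "Journal D": (600, 200),
}

# The score is simply an average: citations per published article.
scores = {name: cites / items for name, (cites, items) in journals.items()}

# A top-25-percent strategy keeps only journals in the highest quartile
# of scores within the chosen field.
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
cutoff = max(1, round(len(ranked) * 0.25))
strategy_list = [name for name, _ in ranked[:cutoff]]

for name, score in ranked:
    print(f"{name}: {score:.2f}")
print("Included in publication strategy:", strategy_list)
```

The sketch shows how little judgement remains once the field and metric are fixed: the ‘strategy’ amounts to ranking journals by one number and cutting the list at a percentile.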
Open Access
One Open Science principle seems relatively unaffected by this way of setting up publication strategies, as it is often (unintentionally) included anyway. Open access publishing, which aims to make scientific publications openly available and accessible, is seen as the pillar of Open Science. Many publication strategies include journals that offer open access publishing (this can be (full) gold open access or (limited) green open access, see [9]). Although this could be an intentional strategy, as a growing number of public funders require open access publishing of the work they fund [10,11], it is mostly an unintentional result: a great number of journals with high metric scores are transitioning to open access publishing (albeit often at much higher cost).
All other Open Science principles
Other Open Science principles are, however, less ‘lucky’. Following the FAIR data principles, increasing collaboration and considering societal impact are also crucial elements of Open Science. Journals with high research metric scores and open access publishing might lack support for these other characteristics. For example, publications in non-English languages or non-article publications (e.g. books and book chapters, presentations or documentaries, investigative journalism and policy advice) could represent a high level of collaboration and strong societal impact. These publications are currently marginalized, as they do not fit into the cycle of scientific publishing, databases, citations and rankings on which research metrics are based. As such, in publication strategies, these publications are undervalued or simply ignored. This can be demotivating for academics who do value Open Science and feel inclined to follow its principles but are not rewarded for their efforts. In fact, deviating from the publication strategy diminishes the value of their publications in, for example, the academic’s yearly evaluation.
What next
Focusing less on the number of publications is part of the new recognition and rewards structure that many universities are currently working towards. When the number of publications does matter, however, it is good to remember that there is more to Open Science than open access publishing. Certain principles of Open Science, such as FAIR data and increased collaboration, do not necessarily correspond with, and are not represented by, high research metric scores. Therefore, publication strategies should not focus on research metrics. By centring publication strategies around these scores, researchers are limited in how they can match their scholarly output(s) with Open Science principles. Instead, publication strategies could adopt a broader perspective. A list of journals with high research metric scores could still be presented, but alongside an explanation of what the score is intended to measure. Additionally, publication strategies could encourage researchers to venture away from such a list in pursuit of multiple Open Science principles. This would fit with the current trend of enlarging the realm of science beyond a publication and citation industry and widening the scope of recognition and rewards.
Acknowledgements
Special thanks to Thed van Leeuwen, who provided valuable feedback on this blog post.
References
[1] EUA, Open Science, European University Association. (2021). https://eua.eu/issues/21:open-science.html (accessed August 20, 2021).
[2] Universiteit Utrecht, Open Science, Universiteit Utrecht. (2021). https://www.uu.nl/en/research/open-science (accessed August 20, 2021).
[3] NWO, Open Science, NWO. (2021). https://www.nwo.nl/open-science (accessed August 20, 2021).
[4] Nationaal Programma Open Science, Open Science, Nationaal Programma Open Science. (2021). https://www.openscience.nl/ (accessed August 20, 2021).
[5] UNESCO, UNESCO Recommendation on Open Science, UNESCO. (2021). https://en.unesco.org/science-sustainable-future/open-science/recommendation (accessed August 19, 2021).
[6] S. Sismondo, Ghost-Managed Medicine: Big Pharma’s Invisible Hands, Mattering Press, Manchester, 2018. https://www.matteringpress.org/books/ghost-managed-medicine (accessed August 20, 2021).
[7] Eigenfactor, Eigenfactor: The Eigenfactor Metrics, Eigenfactor. (2021). http://www.eigenfactor.org/projects/journalRank/journalsearch.php (accessed August 20, 2021).
[8] Scimago Lab, SJR : Scientific Journal Rankings, SJR. (2021). https://www.scimagojr.com/journalrank.php (accessed August 20, 2021).
[9] Elsevier Author Services, Difference between Green and Gold Open Access, Elsevier Author Services - Articles. (2020). https://scientific-publishing.webshop.elsevier.com/publication-process/difference-between-green-gold-open-access/ (accessed August 20, 2021).
[10] Openaccess.nl, Plan S, Open Access. (2021). https://www.openaccess.nl/nl/in-nederland/plan-s (accessed April 21, 2021).
[11] European Commission, Open access - H2020 Online Manual, European Commission. (2020). https://ec.europa.eu/research/participants/docs/h2020-funding-guide/cross-cutting-issues/open-access-data-management/open-access_en.htm (accessed August 20, 2021).