There are many definitions and conceptualizations of the umbrella term ‘open science’. For example, Fecher and Friesike (2014, p. 17) identified five Open Science schools of thought: “The infrastructure school (which is concerned with the technological architecture), the public school (which is concerned with the accessibility of knowledge creation), the measurement school (which is concerned with alternative impact measurement), the democratic school (which is concerned with access to knowledge) and the pragmatic school (which is concerned with collaborative research).” While all of these may have relevance for SoLAR, the democratic school is particularly relevant given its focus on the “principal access to the products of research”. As Fecher and Friesike (2014) argue, when research artefacts (e.g., data) are openly available, other researchers can check and reproduce published findings; open availability also fosters data mining and the aggregation of artefacts from multiple data sets and papers, thereby enhancing generalizability and cross-validation across different contexts.
Open scholarship is a slightly broader concept that encourages researchers to share their knowledge and artefacts with others as early as possible in the research process (Burgelman et al., 2019). Open scholarship is gaining momentum, promoted among others by the European Commission, and “reflects the inclusion of the humanities in the equation as well as emphasizing the open input side to science in the form of open collaboration and active data and knowledge sharing” (Burgelman et al., 2019, p. 1).
Within SoLAR there could be a range of applications of open science and scholarship, including (in alphabetical order) the sharing of algorithms, code, coding schemes, data, experimental materials, model outputs (without disclosing the underlying data), surveys, and synthetic datasets. If SoLAR researchers share some or all of their science and scholarship openly, there could be substantial opportunities for a) replication; b) generalization; c) robustness; and d) education of the SoLAR community at large (Brooks, 2021; Kuzilek et al., 2017; van der Zee & Reich, 2018). As argued by Nosek et al. (2015, p. 2), for science to progress it “needs both innovation and self-correction; replication offers opportunities for self-correction to more efficiently identify promising research directions.”
At the same time, there could be a range of (potentially) legitimate concerns about open science and scholarship, including (in alphabetical order) a) bias towards forms of research for which open science and scholarship are easier to facilitate; b) equity and equality challenges amongst the practitioners of open science and scholarship; c) ethical implications with respect to human subject research; d) intellectual property and legal challenges; e) reputational damage; and f) time costs. As argued by Gehlbach and Robinson (2021), open science and scholarship may be easier to facilitate in educational disciplines that mainly use quantitative and standardized approaches, which could bias reported research towards quantitative studies; sharing artefacts from mixed-method or qualitative research may be more difficult due to privacy, ethical, and sample size concerns. In terms of b), the experience of OU Analyse (Kuzilek et al., 2017), one of the few large publicly available educational datasets, shows that widespread use by other researchers can have unexpected and potentially negative implications. For example, substantial differences in regional progression rates were identified (Rizvi et al., 2019) that related to social and economic conditions rather than to educational provision.
In terms of ethical challenges, as cautioned by Korir et al. (2020), when artefacts are extracted from multiple sources and formats, and in particular when combined with social network and/or discourse artefacts, it might still be possible for participants to identify their peer learners even if the artefacts are “appropriately” anonymized. Such issues may not be within the control of individual researchers: at most scholarly institutions research is overseen by ethics or review boards, and these boards may not allow disclosure of artefacts. If open science methods were implemented poorly, this could bias research reporting towards researchers at institutions that are more willing to release underlying artefacts, rather than towards those engaging in the most meaningful research.
Several researchers and organizations consider their research artefacts to be intellectual property, and making these publicly available might expose them to legal challenges. For example, one could imagine students who failed a degree suing a university for not reacting to analytics that signaled a risk, a concrete example of John Campbell's (2007) early call that learning analytics may entail an obligation of knowing (Fritz & Whitmer, 2019). In addition, intellectual property is regularly monetized by scholars and their institutions, whether in the form of software, algorithms, or survey instruments, and open release may be a disincentive to some members of the community.
Academics who have made errors in data collection, cleaning, analysis, or reporting of results could suffer reputational damage that is extremely detrimental to their careers, and fear of cancellation for mistakes (such as appearing in Retraction Watch) may deter researchers from sharing otherwise strong work with the SoLAR community. Finally, making research open to others may involve substantial costs in terms of time spent on ethics approval, participant consent, data cleaning, anonymization processes, and sharing of analyses or software.