“Knowledge can be public, yet undiscovered, if independently created fragments are logically related but never retrieved, brought together, and interpreted.”
— Don Swanson (1986), Undiscovered Public Knowledge
This National Science Foundation-funded project assessed the ability of non-researchers to aid researchers in performing academic literature reviews. Its goals were two-fold:
To see whether crowdsourcing could complement and enhance researchers' literature search process.
To see whether crowdsourced literature search could serve as a form of apprenticeship for novices (e.g., undergraduates and high school students) to get involved in scientific research.
We have several publications that are currently under review or in progress. Stay tuned!
Doroudi, S. (2023). What is a related work? A typology of relationships in research literature. Synthese, 201(1), 24. https://link.springer.com/article/10.1007/s11229-022-03976-5
This philosophical paper addresses a fundamental question at the heart of literature search that often goes unspoken: "what is a related work?" The paper presents a typology of the different kinds of relationships that can exist between academic works. When we search for literature, are we narrowly searching for articles with overlapping content, or are we searching for works with deeper structural analogies that could uncover new ideas? And, of relevance to this project, what kinds of related work could crowdsourcing help us uncover?
Researchers are welcome to use the following de-identified data from our literature search tasks! Feel free to contact me for more details on the data; a brief sketch of how it might be analyzed follows the list.
Dataset 1: List of articles found by over 600 crowdworkers and 15 researchers across five hypothetical research questions in different fields, along with four judges' evaluations of the quality of literature retrieved for one particular question.
Dataset 2: List of articles found by crowdworkers, undergraduate students, and Elicit for four researchers' projects, along with the researchers' evaluations of the articles.
Dataset 3: Data from a field evaluation of our platform DiscoverTogether, including researchers' project descriptions, crowdworker-identified literature, and feedback from the researchers to the crowdworkers.
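As a minimal sketch of one way these datasets might be used, the snippet below compares the sets of articles retrieved by crowdworkers and researchers using Jaccard similarity. The file name and column names ("worker_type", "article_id") are hypothetical assumptions for illustration, not the actual schema of the released data.

```python
# Minimal sketch: overlap between crowdworker- and researcher-retrieved articles.
# NOTE: the file name and the "worker_type" / "article_id" columns are
# hypothetical; consult the dataset documentation for the real schema.
import pandas as pd

df = pd.read_csv("dataset1_articles.csv")  # hypothetical file name

crowd = set(df.loc[df["worker_type"] == "crowdworker", "article_id"])
researchers = set(df.loc[df["worker_type"] == "researcher", "article_id"])

# Jaccard similarity: size of the intersection over size of the union.
jaccard = len(crowd & researchers) / len(crowd | researchers)
print(f"Crowd-researcher overlap (Jaccard): {jaccard:.2f}")
print(f"Articles found only by crowdworkers: {len(crowd - researchers)}")
```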