Critical Metascience Articles

What is Critical Metascience?

Metascience is the science of science. Critical metascience takes a step back to question some common assumptions, approaches, problems, and solutions in metascience. Hence, it has also been described as meta-meta-science!

The following is a non-exhaustive collection of over 150 articles in this area. These articles address issues such as (a) metascience's focus on replication, statistics, and methods-based reforms; (b) the credibility and rigour of metascientific research; (c) inclusivity and diversity in open science; (d) the rationale for and implementation of preregistration and open data; (e) the importance of theory and theory development; and (f) the sociology of the science reform movement. These and other issues are considered from a variety of perspectives, including work by statisticians, psychologists, cognitive scientists, social scientists, ethnographers, sociologists, philosophers of science and, of course, metascientists!

Below, I've grouped articles by year and then, further down the page, by topic. If I've left out a key article, or there's new work you'd like added, please send me the reference.

Articles By Year

2024

Archer, R. (2024). Retiring Popper: Critical realism, falsificationism, and the crisis of replication. Theory & Psychology. https://doi.org/10.1177/09593543241250079 

Auspurg, K., & Brüderl, J. (2024). Toward a more credible assessment of the credibility of science by many-analyst studies. Proceedings of the National Academy of Sciences, 121(38), e2404035121. https://doi.org/10.1073/pnas.2404035121 

Bak-Coleman, J. B., & Devezer, B. (2024). Claims about scientific rigour require rigour. Nature Human Behaviour. https://doi.org/10.1038/s41562-024-01982-w

Bak-Coleman, J. B., Mann, R. P., Bergstrom, C. T., Gross, K., & West, J. (2024, June 26). The replication crisis is not a crisis of false positives. SocArXiv. https://doi.org/10.31235/osf.io/rkyf7

Burgos, J. E. (2024). Getting ontologically serious about the replication crisis in psychology. Journal of Theoretical and Philosophical Psychology. https://doi.org/10.1037/teo0000281 

Dames, H., Musfeld, P., Popov, V., Oberauer, K., & Frischkorn, G. T. (2024). Responsible research assessment should prioritize theory development and testing over ticking open science boxes. Meta-Psychology. https://doi.org/10.15626/MP.2023.3735 

Dudda, L., Kormann, E., Kozula, M., DeVito, N. J., Klebel, T., Dewi, A. P. M., … Leeflang, M. (2024, June 17). Open science interventions to improve reproducibility and replicability of research: A scoping review preprint. MetaArXiv. https://doi.org/10.31222/osf.io/a8rmu 

Erasmus, A. (2024). p-Hacking: Its costs and when it is warranted. Erkenntnis. https://doi.org/10.1007/s10670-024-00834-3

Feest, U. (2024). What is the replication crisis a crisis of? Philosophy of Science. https://doi.org/10.1017/psa.2024.2 

Field, S. M., Volz, L., Kaznatcheev, A., & van Dongen, N. (2024). Can a good theory be built using bad ingredients? Computational Brain & Behavior. https://doi.org/10.1007/s42113-024-00220-w

Guest, O. (2024). What makes a good theory, and how do we make a theory good? Computational Brain & Behavior. https://doi.org/10.1007/s42113-023-00193-2 

Holzmeister, F., Johannesson, M., Böhm, R., Dreber, A., Huber, J., & Kirchler, M. (2024). Heterogeneity in effect size estimates. Proceedings of the National Academy of Sciences, 121(32), e2403490121. https://doi.org/10.1073/pnas.2403490121 

Hostler, T. J. (2024). Open research reforms and the capitalist university: Areas of opposition and alignment. Collabra: Psychology, 10(1), 121383. https://doi.org/10.1525/collabra.121383 

Hostler, T. J. (2024). Research assessment using a narrow definition of “research quality” is an act of gatekeeping: A comment on Gärtner et al. (2022). Meta-Psychology, 8. https://doi.org/10.15626/MP.2023.3764 

Hutmacher, F., & Franz, D. J. (2024). Approaching psychology’s current crises by exploring the vagueness of psychological concepts: Recommendations for advancing the discipline. American Psychologist. https://doi.org/10.1037/amp0001300 

Iso-Ahola, S. E. (2024). Science of psychological phenomena and their testing. American Psychologist. https://doi.org/10.1037/amp0001362 

Jost, J. T. (2024). Grand challenge: Social psychology without hubris. Frontiers in Social Psychology, 1, Article 1283272. https://doi.org/10.3389/frsps.2023.1283272 

Khan, S., Hirsch, J. S., & Zubida, O. Z. (2024). A dataset without a code book: Ethnography and open science. Frontiers in Sociology, 9, Article 1308029. https://doi.org/10.3389/fsoc.2024.1308029 

Klonsky, E. D. (2024). Campbell’s law explains the replication crisis: Pre-registration badges are history repeating. Assessment. https://doi.org/10.1177/10731911241253430 

Klonsky, E. D. (2024). How to produce, identify, and motivate robust psychological science: A roadmap and a response to Vize et al. Assessment. https://doi.org/10.1177/10731911241299723

Lamb, D., Russell, A., Morant, N., & Stevenson, F. (2024). The challenges of open data sharing for qualitative researchers. Journal of Health Psychology, 29(7), 659-664. https://doi.org/10.1177/13591053241237620

Maziarz, M. (2024). Conflicting results and statistical malleability: Embracing pluralism of empirical results. Perspectives on Science, 32(6), 701-728. https://doi.org/10.1162/posc_a_00627 

Penders, B. (2024). Scandal in scientific reform: The breaking and remaking of science. Journal of Responsible Innovation, 11(1). https://doi.org/10.1080/23299460.2024.2371172 

Penders, B. (2024, November 26). Renovating the theatre of persuasion. ManyLabs as collaborative prototypes for the production of credible knowledge. PsyArXiv. https://doi.org/10.31222/osf.io/vhmk2

Phaf, R. H. (2024). Positive deviance underlies successful science: Normative methodologies risk throwing out the baby with the bathwater. Review of General Psychology. https://doi.org/10.1177/10892680241235120 

Pownall, M. (2024). Is replication possible for qualitative research? A response to Makel et al. (2022). Educational Research and Evaluation, 29(1–2), 104–110.  https://doi.org/10.1080/13803611.2024.2314526 

Prosser, A. M., Bagnall, R., & Higson-Sweeney, N. (2024). Reflection over compliance: Critiquing mandatory data sharing policies for qualitative research. Journal of Health Psychology. https://doi.org/10.1177/13591053231225903

Prosser, A. M., Brown, O., Augustine, G., & Ellis, D. (2024). It’s time to join the conversation: Visions of the future for qualitative transparency and openness in management and organisation studies. SocArXiv. https://osf.io/preprints/socarxiv/ntf73 

Reyes Elizondo, A., & Kaltenbrunner, W. (2024). Navigating the science system: Research integrity and academic survival strategies. Science and Engineering Ethics, 30, Article 12. https://doi.org/10.1007/s11948-024-00467-3 

Rubin, M. (2024). Inconsistent multiple testing corrections: The fallacy of using family-based error rates to make inferences about individual hypotheses. Methods in Psychology, 10, Article 100140. https://doi.org/10.1016/j.metip.2024.100140 

Rubin, M. (2024). Preregistration does not improve the transparent evaluation of severity in Popper’s philosophy of science or when deviations are allowed. arXiv. https://doi.org/10.48550/arXiv.2408.12347

Rubin, M. (2024). Type I error rates are not usually inflated. Journal of Trial & Error. https://doi.org/10.36850/4d35-44bd 

Rubin, M., & Donkin, C. (2024). Exploratory hypothesis tests can be more compelling than confirmatory hypothesis tests. Philosophical Psychology, 37(8), 2019-2047. https://doi.org/10.1080/09515089.2022.2113771 

Souza-Neto, V., & Moyle, B. (2025). Preregistration is not a panacea, but why? A rejoinder to “infusing preregistration into tourism research”. Tourism Management, 107, Article 105061. https://doi.org/10.1016/j.tourman.2024.105061 

Ting, C., & Greenland, S. (2024). Forcing a deterministic frame on probabilistic phenomena: A communication blind spot in media coverage of the “replication crisis.” Science Communication, 46(5), 672-684. https://doi.org/10.1177/10755470241239947

Ulpts, S. (2024). Responsible assessment of what research? Beware of epistemic diversity! Meta-Psychology. https://doi.org/10.15626/MP.2023.3797 

2023

Baumeister, R., Bushman, B., & Tice, D. (2023). Multi-site replications in social psychology: Reflections, implications, and future directions. The Spanish Journal of Psychology, 26, E3. https://doi.org/10.1017/SJP.2023.6

Buzbas, E. O., & Devezer, B. (2023). Tension between theory and practice of replication. Journal of Trial & Error. https://doi.org/10.36850/mr9 

Buzbas, E. O., Devezer, B., & Baumgaertner, B. (2023). The logical structure of experiments lays the foundation for a theory of reproducibility. Royal Society Open Science, 10(3). https://doi.org/10.1098/rsos.221042 

Dal Santo, T., Rice, D. B., Amiri, L. S., Tasleem, A., Li, K., Boruff, J. T., Geoffroy, M.-C., Benedetti, A., & Thombs, B. D. (2023). Methods and results of studies on reporting guideline adherence are poorly reported: A meta-research study. Journal of Clinical Epidemiology. https://doi.org/10.1016/j.jclinepi.2023.05.017

Darda, K. M., Conry-Murray, C., Schmidt, K., Elsherif, M. M., Peverill, M., Yoneda, T., … Gernsbacher, M. (2023, October 29). Promoting civility in formal and informal open science contexts. PsyArXiv. https://doi.org/10.31234/osf.io/rfkyu 

Devezer, B., & Buzbas, E. O. (2023). Rigorous exploration in a model-centric science via epistemic iteration. Journal of Applied Research in Memory and Cognition, 12(2), 189–194. https://doi.org/10.1037/mac0000121 

Devezer, B., & Penders, B. (2023). Scientific reform, citation politics and the bureaucracy of oblivion. Quantitative Science Studies. https://doi.org/10.1162/qss_c_00274 

Hicks, D. J. (2023). Open science, the replication crisis, and environmental public health. Accountability in Research, 30(1), 34-62. https://doi.org/10.1080/08989621.2021.1962713 

Hostler, T. J. (2023). The invisible workload of open research. Journal of Trial & Error. https://doi.org/10.36850/mr5 

Jacobucci, R. (2022). A critique of using the labels confirmatory and exploratory in modern psychological research. Frontiers in Psychology, 13, Article 1020770. https://doi.org/10.3389/fpsyg.2022.1020770 

Lavelle, J. S. (2023, October 2). Growth from uncertainty: Understanding the replication 'crisis' in infant psychology. PhilSci Archive. https://philsci-archive.pitt.edu/22679/ 

Leonelli, S. (2023). Philosophy of open science. Cambridge University Press. https://www.cambridge.org/core/elements/philosophy-of-open-science/0D049ECF635F3B676C03C6868873E406 

Liu, M. (2023). Whose open science are we talking about? From open science in psychology to open science in applied linguistics. Language Teaching, 1-8. https://doi.org/10.1017/S0261444823000307 

Peterson, D., & Panofsky, A. (2023). Metascience as a scientific social movement. Minerva. https://doi.org/10.1007/s11024-023-09490-3

Prosser, A. M. B., Hamshaw, R., Meyer, J., Bagnall, R., Blackwood, L., Huysamen, M., ... & Walter, Z. (2023). When open data closes the door: Problematising a one size fits all approach to open data in journal submission guidelines. British Journal of Social Psychology, 62(4), 1635-1653. https://doi.org/10.1111/bjso.12576 

Rubin, M. (2023). Questionable metascience practices. Journal of Trial & Error, 4(1), 5–20. https://doi.org/10.36850/mr4

Rubin, M. (2023). The replication crisis is less of a “crisis” in Lakatos' philosophy of science. MetaArXiv. https://doi.org/10.31222/osf.io/2dz9s

Schimmelpfennig, R., Spicer, R., White, C., Gervais, W. M., Norenzayan, A., Heine, S., … Muthukrishna, M. (2023, February 9). A problem in theory and more: Measuring the moderating role of culture in Many Labs 2. PsyArXiv. https://psyarxiv.com/hmnrx/

Schwartz, B. (2023, March 20). Psychology’s increased rigor is good news. But is it only good news? Behavioral Scientist. https://behavioralscientist.org/psychologys-increased-rigor-is-good-news-but-is-it-only-good-news/

Steltenpohl, C. N., Lustick, H., Meyer, M. S., Lee, L. E., Stegenga, S. M., Reyes, L. S., & Renbarger, R. L. (2023). Rethinking transparency and rigor from a qualitative open science perspective. Journal of Trial & Error, 4(1). https://doi.org/10.36850/mr7 

Syed, M. (2023, December 8). Some data indicating that editors and reviewers do not check preregistrations during the review process. PsyArXiv. https://doi.org/10.31234/osf.io/nh7qw 

Syrjänen, P. (2023). Novel prediction and the problem of low-quality accommodation. Synthese, 202, Article 182, 1-32. https://doi.org/10.1007/s11229-023-04400-2  

van Drimmelen, T., Slagboom, N., Reis, R., Bouter, L., & van der Steen, J. T. (2023, August 7). Decisions, decisions, decisions: An ethnographic study of researcher discretion in practice. MetaArXiv. https://osf.io/preprints/metaarxiv/7dh3t/

Wilson, B. M., & Wixted, J. T. (2023). On the importance of modeling the invisible world of underlying effect sizes. Social Psychological Bulletin, 18, 1-16. https://doi.org/10.32872/spb.9981 

2022


Baumeister, R. F., Tice, D. M., & Bushman, B. J. (2022). A review of multisite replication projects in social psychology: Is it viable to sustain any confidence in social psychology’s knowledge base? Perspectives on Psychological Science, 18(4), 912-935. https://doi.org/10.1177/17456916221121815 


Bazzoli, A. (2022). Open science and epistemic pluralism: A tale of many perils and some opportunities. Industrial and Organizational Psychology, 15(4), 525-528. https://doi.org/10.1017/iop.2022.67 

Berberi, I., & Roche, D. G. (2022). No evidence that mandatory open data policies increase error correction. Nature Ecology & Evolution, 6(11), 1630-1633. https://doi.org/10.1038/s41559-022-01879-9


Derksen, M., & Field, S. (2022). The tone debate: Knowledge, self, and social order. Review of General Psychology, 26(2), 172-183. https://doi.org/10.1177/10892680211015636


Derksen, M., & Morawski, J. (2022). Kinds of replication: Examining the meanings of “conceptual replication” and “direct replication”. Perspectives on Psychological Science, 17(5), 1490-1505. https://doi.org/10.1177/17456916211041116


Devezer, B., & Buzbas, E. (2022, November 25). Minimum viable experiment to replicate. PhilSci Archive. http://philsci-archive.pitt.edu/id/eprint/21475 


Flis, I. (2022). The function of literature in psychological science. Review of General Psychology, 26(2), 146-156. https://doi.org/10.1177/10892680211066466


Fox Tree, J., Lleras, A., Thomas, A., & Watson, D. (2022, August 30). The inequitable burden of open science. Psychonomic Society Featured Content. https://featuredcontent.psychonomic.org/the-inequitable-burden-of-open-science/


Gollwitzer, M., & Schwabe, J. (2022). Context dependency as a predictor of replicability. Review of General Psychology, 26(2), 241-249. https://doi.org/10.1177/10892680211015635 


Guzzo, R. A., Schneider, B., & Nalbantian, H. R. (2022). Open science, closed doors: The perils and potential of open science for research in practice. Industrial and Organizational Psychology: Perspectives on Science and Practice, 15, 495–515. https://doi.org/10.1017/iop.2022.61 


Haig, B. D. (2022). Understanding replication in a way that is true to science. Review of General Psychology, 26(2), 224-240. https://doi.org/10.1177/10892680211046514


Khalil, A. T., Shinwari, Z. K., & Islam, A. (2022). Fostering openness in open science: An ethical discussion of risks and benefits. Frontiers in Political Science, 4, 930574. https://doi.org/10.3389/fpos.2022.930574 


Lash, T. L. (2022). Getting over TOP. Epidemiology, 33(1), 1-6. https://doi.org/10.1097/EDE.0000000000001424 


Leonelli, S. (2022). Open science and epistemic diversity: Friends or foes? Philosophy of Science, 89(5), 991-1001. https://doi.org/10.1017/psa.2022.45 


Liu, B., & Wei, L. (2022). Unintended effects of open data policy in online behavioral research: An experimental investigation of participants’ privacy concerns and research validity. Computers in Human Behavior, Article 107537. https://doi.org/10.1016/j.chb.2022.107537 


Lohmann, A., Astivia, O. L., Morris, T. P., & Groenwold, R. H. (2022). It's time! Ten reasons to start replicating simulation studies. Frontiers in Epidemiology, 2, 973470. https://doi.org/10.3389/fepid.2022.973470 


Malich, L., & Rehmann-Sutter, C. (2022). Metascience is not enough - A plea for psychological humanities in the wake of the replication crisis. Review of General Psychology, 26(2), 261-273. https://doi.org/10.1177/10892680221083876


McDermott, R. (2022). Breaking free: How preregistration hurts scholars and science. Politics and the Life Sciences, 41(1), 55-59.  https://doi.org/10.1017/pls.2022.4 


Morawski, J. (2022). How to true psychology’s objects. Review of General Psychology, 26(2), 157-171. https://doi.org/10.1177/10892680211046518


Penders, B. (2022). Process and bureaucracy: Scientific reform as civilisation. Bulletin of Science, Technology & Society, 42(4), 107-116. https://doi.org/10.1177/02704676221126388 


Pownall, M., & Hoerst, C. (2022). Slow science in scholarly critique. The Psychologist, 35, 2. https://thepsychologist.bps.org.uk/volume-35/february-2022/slow-science-scholarly-critique


Rubin, M. (2022). The costs of HARKing. British Journal for the Philosophy of Science, 73(2), 535-560. https://doi.org/10.1093/bjps/axz050


Wegener, D. T., Fabrigar, L. R., Pek, J., & Hoisington-Shaw, K. (2022). Evaluating research in personality and social psychology: Considerations of statistical power and concerns about false findings. Personality and Social Psychology Bulletin, 48(7), 1105-1117. https://doi.org/10.1177/01461672211030811 

2021

Anonymous. (2021, November 25). It’s 2021… and we are still dealing with misogyny in the name of open science. University of Sussex School of Psychology Blog. https://blogs.sussex.ac.uk/psychology/2021/11/25/its-2021-and-we-are-still-dealing-with-misogyny-in-the-name-of-open-science/


Bastian, H. (2021, October 31). The metascience movement needs to be more self-critical. PLOS Blogs: Absolutely Maybe. https://absolutelymaybe.plos.org/2021/10/31/the-metascience-movement-needs-to-be-more-self-critical/


Bennett, E. A. (2021). Open science from a qualitative, feminist perspective: Epistemological dogmas and a call for critical examination. Psychology of Women Quarterly, 45(4), 448-456. https://doi.org/10.1177/03616843211036460


Bryan, C. J., Tipton, E., & Yeager, D. S. (2021). Behavioural science is unlikely to change the world without a heterogeneity revolution. Nature Human Behaviour, 5(8), 980-989. https://doi.org/10.1038/s41562-021-01143-3 


Devezer, B., Navarro, D. J., Vandekerckhove, J., & Buzbas, E. O. (2021). The case for formal methodology in scientific reform. Royal Society Open Science, 8(3), Article 200805. https://doi.org/10.1098/rsos.200805


Field, S. M., & Derksen, M. (2021). Experimenter as automaton; experimenter as human: Exploring the position of the researcher in scientific research. European Journal for Philosophy of Science, 11, Article 11. https://doi.org/10.1007/s13194-020-00324-7


Gervais, W. M. (2021). Practical methodological reform needs good theory. Perspectives on Psychological Science, 16(4), 827-843. https://doi.org/10.1177/1745691620977471


Guest, O., & Martin, A. E. (2021). How computational modeling can force theory building in psychological science. Perspectives on Psychological Science, 16(4), 789–802. https://doi.org/10.1177/1745691620970585 


Jacobs, A., Büthe, T., Arjona, A., Arriola, L., Bellin, E., Bennett, A.,...Yashar, D. (2021). The qualitative transparency deliberations: Insights and implications. Perspectives on Politics, 19(1), 171-208. https://doi.org/10.1017/S1537592720001164 


Kessler, A., Likely, R., & Rosenberg, J. M. (2021). Open for whom? The need to define open science for science education. Journal of Research in Science Teaching, 58(10), 1590-1595. https://doi.org/10.1002/tea.21730


Peterson, D., & Panofsky, A. (2021). Arguments against efficiency in science. Social Science Information, 60(3), 350-355. https://doi.org/10.1177/05390184211021383


Pham, M. T., & Oh, T. T. (2021). Preregistration is neither sufficient nor necessary for good science. Journal of Consumer Psychology, 31(1), 163-176. https://doi.org/10.1002/jcpy.1209


Pham, M. T., & Oh, T. T. (2021). On not confusing the tree of trustworthy statistics with the greater forest of good science: A comment on Simmons et al.’s perspective on pre‐registration. Journal of Consumer Psychology, 31(1), 181-185. https://doi.org/10.1002/jcpy.1213 


Proulx, T., & Morey, R. D. (2021). Beyond statistical ritual: Theory in psychological science. Perspectives on Psychological Science, 16(4), 671-681. https://doi.org/10.1177/17456916211017098


Rubin, M. (2021). What type of Type I error? Contrasting the Neyman-Pearson and Fisherian approaches in the context of exact and direct replications. Synthese, 198, 5809–5834. https://doi.org/10.1007/s11229-019-02433-0  


Szollosi, A., & Donkin, C. (2021). Arrested theory development: The misguided distinction between exploratory and confirmatory research. Perspectives on Psychological Science, 16(4), 717-724. https://doi.org/10.1177/1745691620966796


Wentzel, K. R. (2021). Open science reforms: Strengths, challenges, and future directions. Educational Psychologist, 56(2), 161-173. https://doi.org/10.1080/00461520.2021.1901709

2020

Andreoletti, M. (2020). Replicability crisis and scientific reforms: Overlooked issues and unmet challenges. International Studies in the Philosophy of Science, 33(3), 135-151. https://doi.org/10.1080/02698595.2021.1943292

Bird, A. (2020). Understanding the replication crisis as a base rate fallacy. The British Journal for the Philosophy of Science, 72(4), 965-993. https://doi.org/10.1093/bjps/axy051 


Guttinger, S. (2020). The limits of replicability. European Journal for Philosophy of Science, 10(2), 1-17. https://doi.org/10.1007/s13194-019-0269-1


Iso-Ahola, S. E. (2020). Replication and the establishment of scientific truth. Frontiers in Psychology, 11, Article 2183. https://doi.org/10.3389/fpsyg.2020.02183 


Lewandowsky, S., & Oberauer, K. (2020). Low replicability can support robust and efficient science. Nature Communications, 11, Article 358. https://doi.org/10.1038/s41467-019-14203-0


Navarro, D. (2020, September 23). Paths in strange spaces: A comment on preregistration. PsyArXiv. https://doi.org/10.31234/osf.io/wxn58


Pratt, M. G., Kaplan, S., & Whittington, R. (2020). The tumult over transparency: Decoupling transparency from replication in establishing trustworthy qualitative research. Administrative Science Quarterly, 65(1), 1-19. https://doi.org/10.1177/0001839219887663


Rubin, M. (2020). Does preregistration improve the credibility of research findings? The Quantitative Methods for Psychology, 16(4), 376–390. https://doi.org/10.20982/tqmp.16.4.p376


Szollosi, A., Kellen, D., Navarro, D. J., Shiffrin, R., van Rooij, I., Van Zandt, T., & Donkin, C. (2020). Is preregistration worthwhile? Trends in Cognitive Sciences, 24(2), 94-95. https://doi.org/10.1016/j.tics.2019.11.009


Ulrich, R., & Miller, J. (2020). Meta-research: Questionable research practices may have little effect on replicability. eLife, 9, Article e58237. https://doi.org/10.7554/eLife.58237


Whitaker, K., & Guest, O. (2020). #bropenscience is broken science. The Psychologist, 33, 34-37. https://thepsychologist.bps.org.uk/volume-33/november-2020/bropenscience-broken-science

2019

Bahlai, C., Bartlett, L. J., Burgio, K. R., Fournier, A. M., Keiser, C. N., Poisot, T., & Whitney, K. S. (2019). Open science isn’t always open to all scientists. American Scientist, 107(2), 78-82. http://dx.doi.org/10.1511/2019.107.2.78 


Bryan, C. J., Yeager, D. S., & O’Brien, J. M. (2019). Replicator degrees of freedom allow publication of misleading failures to replicate. Proceedings of the National Academy of Sciences, 116(51), 25535-25545. https://doi.org/10.1073/pnas.1910951116 


Derksen, M. (2019). Putting Popper to work. Theory & Psychology, 29(4), 449-465. https://doi.org/10.1177/0959354319838343


Devezer, B., Nardin, L. G., Baumgaertner, B., & Buzbas, E. O. (2019). Scientific discovery in a model-centric framework: Reproducibility, innovation, and epistemic diversity. PLOS ONE, 14(5), Article e0216125. https://doi.org/10.1371/journal.pone.0216125


Drummond, C. (2019). Is the drive for reproducible science having a detrimental effect on what is published? Learned Publishing, 32(1), 63-69. https://doi.org/10.1002/leap.1224


Feest, U. (2019). Why replication is overrated. Philosophy of Science, 86(5), 895-905. https://doi.org/10.1086/705451


Flis, I. (2019). Psychologists psychologizing scientific psychology: An epistemological reading of the replication crisis. Theory & Psychology, 29(2), 158-181. https://doi.org/10.1177/0959354319835322


Lewandowsky, S. (2019, January 22). Avoiding Nimitz Hill with more than a Little Red Book: Summing up #PSprereg. Psychonomic Society. https://featuredcontent.psychonomic.org/avoiding-nimitz-hill-with-more-than-a-little-red-book-summing-up-psprereg/


MacEachern, S. N., & Van Zandt, T. (2019). Preregistration of modeling exercises may not be useful. Computational Brain & Behavior, 2, 179-182. https://doi.org/10.1007/s42113-019-00038-x


Morawski, J. (2019). The replication crisis: How might philosophy and theory of psychology be of use? Journal of Theoretical and Philosophical Psychology, 39(4), 218–238. https://doi.org/10.1037/teo0000129


Morey, R. (2019). You must tug that thread: Why treating preregistration as a gold standard might incentivize poor behavior. Psychonomic Society. https://featuredcontent.psychonomic.org/you-must-tug-that-thread-why-treating-preregistration-as-a-gold-standard-might-incentivize-poor-behavior/


Oberauer, K. (2019, January 15). Preregistration of a forking path – What does it add to the garden of evidence? Psychonomic Society. https://featuredcontent.psychonomic.org/preregistration-of-a-forking-path-what-does-it-add-to-the-garden-of-evidence/


Oberauer, K., & Lewandowsky, S. (2019). Addressing the theory crisis in psychology. Psychonomic Bulletin & Review, 26(5), 1596-1618. https://doi.org/10.3758/s13423-019-01645-2


Penders, B., Holbrook, J. B., & de Rijcke, S. (2019). Rinse and repeat: Understanding the value of replication across different ways of knowing. Publications, 7(3), 52. https://doi.org/10.3390/publications7030052 


Sacco, D. F., Brown, M., & Bruton, S. V. (2019). Grounds for ambiguity: Justifiable bases for engaging in questionable research practices. Science and Engineering Ethics, 25(5), 1321-1337. https://doi.org/10.1007/s11948-018-0065-x


Shiffrin, R. (2019). Complexity of science v. #PSprereg? Psychonomic Society. https://featuredcontent.psychonomic.org/complexity-of-science-v-psprereg/


Stroebe, W. (2019). What can we learn from many labs replications? Basic and Applied Social Psychology, 41(2), 91-103. https://doi.org/10.1080/01973533.2019.1577736 


van Rooij, I. (2019). Psychological science needs theory development before preregistration. Psychonomic Society. https://featuredcontent.psychonomic.org/psychological-science-needs-theory-development-before-preregistration/  


Wiggins, B. J., & Christopherson, C. D. (2019). The replication crisis in psychology: An overview for theoretical and philosophical psychology. Journal of Theoretical and Philosophical Psychology, 39(4), 202–217. https://doi.org/10.1037/teo0000137


Wood, W., & Wilson, T. D. (2019, August 22). No crisis but no time for complacency. APS Observer, 32(7). https://www.psychologicalscience.org/observer/no-crisis-but-no-time-for-complacency

2018

Fanelli, D. (2018). Opinion: Is science really facing a reproducibility crisis, and do we need it to? Proceedings of the National Academy of Sciences, 115(11), 2628-2631. https://doi.org/10.1073/pnas.1708272114


Fiedler, K. (2018). The creative cycle and the growth of psychological science. Perspectives on Psychological Science, 13(4), 433-438. https://doi.org/10.1177/1745691617745651 


Gigerenzer, G. (2018). Statistical rituals: The replication delusion and how we got there. Advances in Methods and Practices in Psychological Science, 1(2), 198-218. https://doi.org/10.1177/2515245918771329 


Ledgerwood, A. (2018). The preregistration revolution needs to distinguish between predictions and analyses. Proceedings of the National Academy of Sciences, 115(45), E10516-E10517. https://doi.org/10.1073/pnas.1812592115 


Leonelli, S. (2018). Rethinking reproducibility as a criterion for research quality. In Including a symposium on Mary Morgan: Curiosity, imagination, and surprise (Research in the History of Economic Thought and Methodology, Vol. 36B, pp. 129-146). Emerald Publishing. https://doi.org/10.1108/S0743-41542018000036B009


Mirowski, P. (2018). The future(s) of open science. Social Studies of Science, 48(2), 171-203. https://doi.org/10.1177/0306312718772086 


Redish, A. D., Kummerfeld, E., Morris, R. L., & Love, A. C. (2018). Reproducibility failures are essential to scientific inquiry. Proceedings of the National Academy of Sciences, 115, 5042-5046. https://doi.org/10.1073/pnas.1806370115


Vancouver, J. N. (2018). In defense of HARKing. Industrial and Organizational Psychology, 11(1), 73–80. https://doi.org/10.1017/iop.2017.89


Wilson, B. M., & Wixted, J. T. (2018). The prior odds of testing a true effect in cognitive and social psychology. Advances in Methods and Practices in Psychological Science, 1(2), 186–197. https://doi.org/10.1177/2515245918767122 

2017

Finkel, E. J., Eastwick, P. W., & Reis, H. T. (2017). Replicability and other features of a high-quality science: Toward a balanced and empirical approach. Journal of Personality and Social Psychology, 113(2), 244-253. http://dx.doi.org/10.1037/pspi0000075 


Greenfield, P. M. (2017). Cultural change over time: Why replicability should not be the gold standard in psychological science. Perspectives on Psychological Science, 12(5), 762-771. https://doi.org/10.1177/1745691617707314


Hamlin, J. K. (2017). Is psychology moving in the right direction? An analysis of the evidentiary value movement. Perspectives on Psychological Science, 12(4), 690-693. https://doi.org/10.1177/1745691616689062


Hollenbeck, J. R., & Wright, P. M. (2017). Harking, sharking, and tharking: Making the case for post hoc analysis of scientific data. Journal of Management, 43(1), 5-8. https://doi.org/10.1177/0149206316679487


Iso-Ahola, S. E. (2017). Reproducibility in psychological science: When do psychological phenomena exist? Frontiers in Psychology, 8, 879. https://doi.org/10.3389/fpsyg.2017.00879 


Levin, N., & Leonelli, S. (2017). How does one “open” science? Questions of value in biological research. Science, Technology, & Human Values, 42(2), 280-305. https://doi.org/10.1177/0162243916672071


Rubin, M. (2017). An evaluation of four solutions to the forking paths problem: Adjusted alpha, preregistration, sensitivity analyses, and abandoning the Neyman-Pearson approach. Review of General Psychology, 21(4), 321-329. https://doi.org/10.1037/gpr0000135  


Rubin, M. (2017). Do p values lose their meaning in exploratory analyses? It depends how you define the familywise error rate. Review of General Psychology, 21(3), 269-275. https://doi.org/10.1037/gpr0000123  


Rubin, M. (2017). When does HARKing hurt? Identifying when different types of undisclosed post hoc hypothesizing harm scientific progress. Review of General Psychology, 21(4), 308-320. https://doi.org/10.1037/gpr0000128

2016

Crandall, C. S., & Sherman, J. W. (2016). On the scientific superiority of conceptual replications for scientific progress. Journal of Experimental Social Psychology, 66, 93-99. https://doi.org/10.1016/j.jesp.2015.10.002 


Fiedler, K., & Schwarz, N. (2016). Questionable research practices revisited. Social Psychological and Personality Science, 7(1), 45-52. https://doi.org/10.1177/1948550615612150


Firestein, S. (2016, February 14). Why failure to replicate findings can actually be good for science. LA Times. https://www.latimes.com/opinion/op-ed/la-oe-0214-firestein-science-replication-failure-20160214-story.html


Fiske, S. T. (2016, October 31). A call to change science’s culture of shaming. APS Observer, 29. https://www.psychologicalscience.org/observer/a-call-to-change-sciences-culture-of-shaming


Gilbert, D. T., King, G., Pettigrew, S., & Wilson, T. D. (2016). Comment on “Estimating the reproducibility of psychological science”. Science, 351(6277), 1037. https://doi.org/10.1126/science.aad7243


Gilbert, D. T., King, G., Pettigrew, S., & Wilson, T. D. (2016). More on “Estimating the reproducibility of psychological science”. https://gking.harvard.edu/files/gking/files/gkpw_post_publication_response.pdf


Guest, O. (2016). Crisis in what exactly? The Winnower. https://doi.org/10.15200/winn.146590.01538 


Phaf, R. H. (2016). Replication requires psychological rather than statistical hypotheses: The case of eye movements enhancing word recollection. Frontiers in Psychology, 7. https://doi.org/10.3389/fpsyg.2016.02023


Schaller, M. (2016). The empirical benefits of conceptual rigor: Systematic articulation of conceptual hypotheses can reduce the risk of non-replicable results (and facilitate novel discoveries too). Journal of Experimental Social Psychology, 66, 107-115. https://doi.org/10.1016/j.jesp.2015.09.006 


Trafimow, D., & Earp, B. D. (2016). Badly specified theories are not responsible for the replication crisis in social psychology: Comment on Klein. Theory & Psychology, 26(4), 540–548. https://doi.org/10.1177/0959354316637136 


Tsai, A. C., Kohrt, B. A., Matthews, L. T., Betancourt, T. S., Lee, J. K., Papachristos, A. V., ... & Dworkin, S. L. (2016). Promises and pitfalls of data sharing in qualitative research. Social Science & Medicine, 169, 191-198. https://doi.org/10.1016/j.socscimed.2016.08.004 

2015

Barrett, L. F. (2015, September 1). Psychology is not in crisis. The New York Times. https://www3.nd.edu/~ghaeffel/ScienceWorks.pdf


Maxwell, S. E., Lau, M. Y., & Howard, G. S. (2015). Is psychology suffering from a replication crisis? What does “failure to replicate” really mean? American Psychologist, 70(6), 487–498. https://doi.org/10.1037/a0039400

2014

Stanley, D. J., & Spence, J. R. (2014). Expectations for replications: Are yours realistic? Perspectives on Psychological Science, 9(3), 305-318. https://doi.org/10.1177/1745691614528518


Stroebe, W., & Strack, F. (2014). The alleged crisis and the illusion of exact replication. Perspectives on Psychological Science, 9(1), 59-71. https://doi.org/10.1177/1745691613514450

2013

Scott, S. (2013, July 25). Pre-registration would put science in chains. Times Higher Education. https://www.timeshighereducation.com/comment/opinion/pre-registration-would-put-science-in-chains/2005954.article 

2012

Fiedler, K., Kutzner, F., & Krueger, J. I. (2012). The long way from α-error control to validity proper: Problems with a short-sighted false-positive debate. Perspectives on Psychological Science, 7(6), 661-669. https://doi.org/10.1177/1745691612462587 


Lash, T. L., & Vandenbroucke, J. P. (2012). Commentary: Should preregistration of epidemiologic study protocols become compulsory?: Reflections and a counterproposal. Epidemiology, 23(2), 184-188. https://doi.org/10.1097/EDE.0b013e318245c05b 

Articles By Topic

Replication & Reproducibility

Burgos, J. E. (2024). Getting ontologically serious about the replication crisis in psychology. Journal of Theoretical and Philosophical Psychology. https://doi.org/10.1037/teo0000281 

Ting, C., & Greenland, S. (2024). Forcing a deterministic frame on probabilistic phenomena: A communication blind spot in media coverage of the “replication crisis.” Science Communication, 46(5), 672-684. https://doi.org/10.1177/10755470241239947

Baumeister, R., Bushman, B., & Tice, D. (2023). Multi-site replications in social psychology: Reflections, implications, and future directions. The Spanish Journal of Psychology, 26, E3. https://doi.org/10.1017/SJP.2023.6

Buzbas, E. O., & Devezer, B. (2023). Tension between theory and practice of replication. Journal of Trial & Error. https://doi.org/10.36850/mr9 

Buzbas, E. O., Devezer, B., & Baumgaertner, B. (2023). The logical structure of experiments lays the foundation for a theory of reproducibility. Royal Society Open Science, 10(3). https://doi.org/10.1098/rsos.221042 

Feest, U. (2024). What is the replication crisis a crisis of? Philosophy of Science. https://doi.org/10.1017/psa.2024.2 

Rubin, M. (2023). The replication crisis is less of a “crisis” in Lakatos' philosophy of science. MetaArXiv. https://doi.org/10.31222/osf.io/2dz9s

Wilson, B. M., & Wixted, J. T. (2023). On the importance of modeling the invisible world of underlying effect sizes. Social Psychological Bulletin, 18, 1-16. https://doi.org/10.32872/spb.9981 

Derksen, M., & Morawski, J. (2022). Kinds of replication: Examining the meanings of “conceptual replication” and “direct replication”. Perspectives on Psychological Science, 17(5), 1490-1505. https://doi.org/10.1177/17456916211041116

Devezer, B., & Buzbas, E. (2022, November 25). Minimum viable experiment to replicate. PhilSci Archive. http://philsci-archive.pitt.edu/id/eprint/21475 

Gollwitzer, M., & Schwabe, J. (2022). Context dependency as a predictor of replicability. Review of General Psychology, 26(2), 241-249. https://doi.org/10.1177/10892680211015635 

Haig, B. D. (2022). Understanding replication in a way that is true to science. Review of General Psychology, 26(2), 224-240. https://doi.org/10.1177/10892680211046514

Rubin, M. (2021). What type of Type I error? Contrasting the Neyman-Pearson and Fisherian approaches in the context of exact and direct replications. Synthese, 198, 5809–5834. https://doi.org/10.1007/s11229-019-02433-0 

Wegener, D. T., Fabrigar, L. R., Pek, J., & Hoisington-Shaw, K. (2022). Evaluating research in personality and social psychology: Considerations of statistical power and concerns about false findings. Personality and Social Psychology Bulletin, 48(7), 1105-1117. https://doi.org/10.1177/01461672211030811

Bird, A. (2020). Understanding the replication crisis as a base rate fallacy. The British Journal for the Philosophy of Science, 72(4), 965-993. https://doi.org/10.1093/bjps/axy051 

Iso-Ahola, S. E. (2020). Replication and the establishment of scientific truth. Frontiers in Psychology, 11, Article 2183. https://doi.org/10.3389/fpsyg.2020.02183 

Lewandowsky, S., & Oberauer, K. (2020). Low replicability can support robust and efficient science. Nature Communications, 11, Article 358. https://doi.org/10.1038/s41467-019-14203-0


Feest, U. (2019). Why replication is overrated. Philosophy of Science, 86(5), 895-905. https://doi.org/10.1086/705451


Morawski, J. (2019). The replication crisis: How might philosophy and theory of psychology be of use? Journal of Theoretical and Philosophical Psychology, 39(4), 218–238. https://doi.org/10.1037/teo0000129


Wood, W., & Wilson, T. D. (2019, August 22). No crisis but no time for complacency. APS Observer, 32(7). https://www.psychologicalscience.org/observer/no-crisis-but-no-time-for-complacency

Fanelli, D. (2018). Opinion: Is science really facing a reproducibility crisis, and do we need it to? Proceedings of the National Academy of Sciences, 115(11), 2628-2631. https://doi.org/10.1073/pnas.1708272114


Gigerenzer, G. (2018). Statistical rituals: The replication delusion and how we got there. Advances in Methods and Practices in Psychological Science, 1(2), 198-218. https://doi.org/10.1177/2515245918771329 


Redish, A. D., Kummerfeld, E., Morris, R. L., & Love, A. C. (2018). Reproducibility failures are essential to scientific inquiry. Proceedings of the National Academy of Sciences, 115, 5042-5046. https://doi.org/10.1073/pnas.1806370115


Iso-Ahola, S. E. (2017). Reproducibility in psychological science: When do psychological phenomena exist? Frontiers in Psychology, 8, 879. https://doi.org/10.3389/fpsyg.2017.00879 


Crandall, C. S., & Sherman, J. W. (2016). On the scientific superiority of conceptual replications for scientific progress. Journal of Experimental Social Psychology, 66, 93-99. https://doi.org/10.1016/j.jesp.2015.10.002 


Firestein, S. (2016, February 14). Why failure to replicate findings can actually be good for science. LA Times. https://www.latimes.com/opinion/op-ed/la-oe-0214-firestein-science-replication-failure-20160214-story.html


Phaf, R. H. (2016). Replication requires psychological rather than statistical hypotheses: The case of eye movements enhancing word recollection. Frontiers in Psychology, 7. https://doi.org/10.3389/fpsyg.2016.02023


Barrett, L. F. (2015, September 1). Psychology is not in crisis. The New York Times. https://www3.nd.edu/~ghaeffel/ScienceWorks.pdf

Maxwell, S. E., Lau, M. Y., & Howard, G. S. (2015). Is psychology suffering from a replication crisis? What does “failure to replicate” really mean? American Psychologist, 70(6), 487–498. https://doi.org/10.1037/a0039400


Stanley, D. J., & Spence, J. R. (2014). Expectations for replications: Are yours realistic? Perspectives on Psychological Science, 9(3), 305-318. https://doi.org/10.1177/1745691614528518


Stroebe, W., & Strack, F. (2014). The alleged crisis and the illusion of exact replication. Perspectives on Psychological Science, 9(1), 59-71. https://doi.org/10.1177/1745691613514450

False Positives & Questionable Research Practices

Bak-Coleman, J. B., Mann, R. P., Bergstrom, C. T., Gross, K., & West, J. (2024, June 26). The replication crisis is not a crisis of false positives. SocArXiv. https://doi.org/10.31235/osf.io/rkyf7

Erasmus, A. (2024). p-Hacking: Its costs and when it is warranted. Erkenntnis. https://doi.org/10.1007/s10670-024-00834-3

Reyes Elizondo, A., & Kaltenbrunner, W. (2024). Navigating the science system: Research integrity and academic survival strategies. Science and Engineering Ethics, 30, Article 12. https://doi.org/10.1007/s11948-024-00467-3 

Rubin, M. (2024). Inconsistent multiple testing corrections: The fallacy of using family-based error rates to make inferences about individual hypotheses. Methods in Psychology, 10, Article 100140. https://doi.org/10.1016/j.metip.2024.100140 

Rubin, M. (2024). Type I error rates are not usually inflated. Journal of Trial & Error. https://doi.org/10.36850/4d35-44bd 

Syrjänen, P. (2023). Novel prediction and the problem of low-quality accommodation. Synthese, 202, Article 182, 1-32. https://doi.org/10.1007/s11229-023-04400-2

van Drimmelen, T., Slagboom, N., Reis, R., Bouter, L., & van der Steen, J. T. (2023, August 7). Decisions, decisions, decisions: An ethnographic study of researcher discretion in practice. MetaArXiv. https://osf.io/preprints/metaarxiv/7dh3t/

Rubin, M. (2022). The costs of HARKing. British Journal for the Philosophy of Science, 73(2), 535-560. https://doi.org/10.1093/bjps/axz050

Ulrich, R., & Miller, J. (2020). Meta-research: Questionable research practices may have little effect on replicability. eLife, 9, Article e58237. https://doi.org/10.7554/eLife.58237


Sacco, D. F., Brown, M., & Bruton, S. V. (2019). Grounds for ambiguity: Justifiable bases for engaging in questionable research practices. Science and Engineering Ethics, 25(5), 1321-1337. https://doi.org/10.1007/s11948-018-0065-x


Vancouver, J. N. (2018). In defense of HARKing. Industrial and Organizational Psychology, 11(1), 73–80. https://doi.org/10.1017/iop.2017.89

Wilson, B. M., & Wixted, J. T. (2018). The prior odds of testing a true effect in cognitive and social psychology. Advances in Methods and Practices in Psychological Science, 1(2), 186–197. https://doi.org/10.1177/2515245918767122 


Hollenbeck, J. R., & Wright, P. M. (2017). Harking, sharking, and tharking: Making the case for post hoc analysis of scientific data. Journal of Management, 43(1), 5-8. https://doi.org/10.1177/0149206316679487


Rubin, M. (2017). An evaluation of four solutions to the forking paths problem: Adjusted alpha, preregistration, sensitivity analyses, and abandoning the Neyman-Pearson approach. Review of General Psychology, 21(4), 321-329. https://doi.org/10.1037/gpr0000135  


Rubin, M. (2017). Do p values lose their meaning in exploratory analyses? It depends how you define the familywise error rate. Review of General Psychology, 21(3), 269-275. https://doi.org/10.1037/gpr0000123  


Rubin, M. (2017). When does HARKing hurt? Identifying when different types of undisclosed post hoc hypothesizing harm scientific progress. Review of General Psychology, 21(4), 308-320. https://doi.org/10.1037/gpr0000128


Fiedler, K., & Schwarz, N. (2016). Questionable research practices revisited. Social Psychological and Personality Science, 7(1), 45-52. https://doi.org/10.1177/1948550615612150

Open Data

Khan, S., Hirsch, J. S., & Zubida, O. Z. (2024). A dataset without a code book: Ethnography and open science. Frontiers in Sociology, 9, Article 1308029. https://doi.org/10.3389/fsoc.2024.1308029 

Lamb, D., Russell, A., Morant, N., & Stevenson, F. (2024). The challenges of open data sharing for qualitative researchers. Journal of Health Psychology, 29(7), 659-664. https://doi.org/10.1177/13591053241237620

Prosser, A. M., Bagnall, R., & Higson-Sweeney, N. (2024). Reflection over compliance: Critiquing mandatory data sharing policies for qualitative research. Journal of Health Psychology. https://doi.org/10.1177/13591053231225903

Prosser, A. M. B., Hamshaw, R., Meyer, J., Bagnall, R., Blackwood, L., Huysamen, M., ... & Walter, Z. (2023). When open data closes the door: Problematising a one size fits all approach to open data in journal submission guidelines. British Journal of Social Psychology, 62(4), 1635-1653. https://doi.org/10.1111/bjso.12576 

Berberi, I., & Roche, D. G. (2022). No evidence that mandatory open data policies increase error correction. Nature Ecology & Evolution, 6(11), 1630-1633. https://doi.org/10.1038/s41559-022-01879-9

Khalil, A. T., Shinwari, Z. K., & Islam, A. (2022). Fostering openness in open science: An ethical discussion of risks and benefits. Frontiers in Political Science, 4, 930574. https://doi.org/10.3389/fpos.2022.930574 

Jacobs, A., Büthe, T., Arjona, A., Arriola, L., Bellin, E., Bennett, A.,...Yashar, D. (2021). The qualitative transparency deliberations: Insights and implications. Perspectives on Politics, 19(1), 171-208. https://doi.org/10.1017/S1537592720001164 

Tsai, A. C., Kohrt, B. A., Matthews, L. T., Betancourt, T. S., Lee, J. K., Papachristos, A. V., ... & Dworkin, S. L. (2016). Promises and pitfalls of data sharing in qualitative research. Social Science & Medicine, 169, 191-198. https://doi.org/10.1016/j.socscimed.2016.08.004 

Preregistration

Souza-Neto, V., & Moyle, B. (2025). Preregistration is not a panacea, but why? A rejoinder to “infusing preregistration into tourism research”. Tourism Management, 107, Article 105061. https://doi.org/10.1016/j.tourman.2024.105061 

Klonsky, E. D. (2024). Campbell’s law explains the replication crisis: Pre-registration badges are history repeating. Assessment. https://doi.org/10.1177/10731911241253430 

Klonsky, E. D. (2024). How to produce, identify, and motivate robust psychological science: A roadmap and a response to Vize et al. Assessment. https://doi.org/10.1177/10731911241299723 

Rubin, M. (2024). Preregistration does not improve the transparent evaluation of severity in Popper’s philosophy of science or when deviations are allowed. arXiv. https://doi.org/10.48550/arXiv.2408.12347

Syed, M. (2023, December 8). Some data indicating that editors and reviewers do not check preregistrations during the review process. PsyArXiv. https://doi.org/10.31234/osf.io/nh7qw 

McDermott, R. (2022). Breaking free: How preregistration hurts scholars and science. Politics and the Life Sciences, 41(1), 55-59.  https://doi.org/10.1017/pls.2022.4 

Pham, M. T., & Oh, T. T. (2021). Preregistration is neither sufficient nor necessary for good science. Journal of Consumer Psychology, 31(1), 163-176. https://doi.org/10.1002/jcpy.1209

Pham, M. T., & Oh, T. T. (2021). On not confusing the tree of trustworthy statistics with the greater forest of good science: A comment on Simmons et al.’s perspective on pre‐registration. Journal of Consumer Psychology, 31(1), 181-185. https://doi.org/10.1002/jcpy.1213 

Navarro, D. (2020, September 23). Paths in strange spaces: A comment on preregistration. PsyArXiv. https://doi.org/10.31234/osf.io/wxn58


Rubin, M. (2020). Does preregistration improve the credibility of research findings? The Quantitative Methods for Psychology, 16(4), 376–390. https://doi.org/10.20982/tqmp.16.4.p376


Szollosi, A., Kellen, D., Navarro, D. J., Shiffrin, R., van Rooij, I., Van Zandt, T., & Donkin, C. (2020). Is preregistration worthwhile? Trends in Cognitive Sciences, 24(2), 94-95. https://doi.org/10.1016/j.tics.2019.11.009


Lewandowsky, S. (2019, January 22). Avoiding Nimitz Hill with more than a Little Red Book: Summing up #PSprereg. Psychonomic Society. https://featuredcontent.psychonomic.org/avoiding-nimitz-hill-with-more-than-a-little-red-book-summing-up-psprereg/


MacEachern, S. N., & Van Zandt, T. (2019). Preregistration of modeling exercises may not be useful. Computational Brain & Behavior, 2, 179-182. https://doi.org/10.1007/s42113-019-00038-x


Morey, R. (2019). You must tug that thread: Why treating preregistration as a gold standard might incentivize poor behavior. Psychonomic Society. https://featuredcontent.psychonomic.org/you-must-tug-that-thread-why-treating-preregistration-as-a-gold-standard-might-incentivize-poor-behavior/


Oberauer, K. (2019, January 15). Preregistration of a forking path – What does it add to the garden of evidence? Psychonomic Society. https://featuredcontent.psychonomic.org/preregistration-of-a-forking-path-what-does-it-add-to-the-garden-of-evidence/


Shiffrin, R. (2019). Complexity of science v. #PSprereg? Psychonomic Society. https://featuredcontent.psychonomic.org/complexity-of-science-v-psprereg/


Ledgerwood, A. (2018). The preregistration revolution needs to distinguish between predictions and analyses. Proceedings of the National Academy of Sciences, 115(45), E10516-E10517. https://doi.org/10.1073/pnas.1812592115 


Scott, S. (2013, July 25). Pre-registration would put science in chains. Times Higher Education. https://www.timeshighereducation.com/comment/opinion/pre-registration-would-put-science-in-chains/2005954.article 


Lash, T. L., & Vandenbroucke, J. P. (2012). Commentary: Should preregistration of epidemiologic study protocols become compulsory?: Reflections and a counterproposal. Epidemiology, 23(2), 184-188. https://doi.org/10.1097/EDE.0b013e318245c05b 

Epistemic & Disciplinary Diversity

Archer, R. (2024). Retiring Popper: Critical realism, falsificationism, and the crisis of replication. Theory & Psychology. https://doi.org/10.1177/09593543241250079 

Hostler, T. J. (2024). Research assessment using a narrow definition of “research quality” is an act of gatekeeping: A comment on Gärtner et al. (2022). Meta-Psychology, 8. https://doi.org/10.15626/MP.2023.3764 

Pownall, M. (2024). Is replication possible for qualitative research? A response to Makel et al. (2022). Educational Research and Evaluation, 29(1–2), 104–110.  https://doi.org/10.1080/13803611.2024.2314526 

Prosser, A. M., Brown, O., Augustine, G., & Ellis, D. (2024). It’s time to join the conversation: Visions of the future for qualitative transparency and openness in management and organisation studies. SocArXiv. https://osf.io/preprints/socarxiv/ntf73 

Ulpts, S. (2024). Responsible assessment of what research? Beware of epistemic diversity! Meta-Psychology. https://doi.org/10.15626/MP.2023.3797 

Hicks, D. J. (2023). Open science, the replication crisis, and environmental public health. Accountability in Research, 30(1), 34-62. https://doi.org/10.1080/08989621.2021.1962713 

Leonelli, S. (2023). Philosophy of open science. Cambridge University Press. https://www.cambridge.org/core/elements/philosophy-of-open-science/0D049ECF635F3B676C03C6868873E406 

Liu, M. (2023). Whose open science are we talking about? From open science in psychology to open science in applied linguistics. Language Teaching, 1-8. https://doi.org/10.1017/S0261444823000307 

Steltenpohl, C. N., Lustick, H., Meyer, M. S., Lee, L. E., Stegenga, S. M., Reyes, L. S., & Renbarger, R. L. (2023). Rethinking transparency and rigor from a qualitative open science perspective. Journal of Trial & Error, 4(1). https://doi.org/10.36850/mr7 

Bazzoli, A. (2022). Open science and epistemic pluralism: A tale of many perils and some opportunities. Industrial and Organizational Psychology, 15(4), 525-528. https://doi.org/10.1017/iop.2022.67

Guzzo, R. A., Schneider, B., & Nalbantian, H. R. (2022). Open science, closed doors: The perils and potential of open science for research in practice. Industrial and Organizational Psychology: Perspectives on Science and Practice, 15, 495–515. https://doi.org/10.1017/iop.2022.61 

Lash, T. L. (2022). Getting over TOP. Epidemiology, 33(1), 1-6. https://doi.org/10.1097/EDE.0000000000001424 

Leonelli, S. (2022). Open science and epistemic diversity: Friends or foes? Philosophy of Science, 89(5), 991-1001. https://doi.org/10.1017/psa.2022.45 

Malich, L., & Rehmann-Sutter, C. (2022). Metascience is not enough - A plea for psychological humanities in the wake of the replication crisis. Review of General Psychology, 26(2), 261-273. https://doi.org/10.1177/10892680221083876

Bennett, E. A. (2021). Open science from a qualitative, feminist perspective: Epistemological dogmas and a call for critical examination. Psychology of Women Quarterly, 45(4), 448-456. https://doi.org/10.1177/03616843211036460


Field, S. M., & Derksen, M. (2021). Experimenter as automaton; experimenter as human: Exploring the position of the researcher in scientific research. European Journal for Philosophy of Science, 11, Article 11. https://doi.org/10.1007/s13194-020-00324-7

Guttinger, S. (2020). The limits of replicability. European Journal for Philosophy of Science, 10(2), 1-17. https://doi.org/10.1007/s13194-019-0269-1


Pratt, M. G., Kaplan, S., & Whittington, R. (2020). The tumult over transparency: Decoupling transparency from replication in establishing trustworthy qualitative research. Administrative Science Quarterly, 65(1), 1-19. https://doi.org/10.1177/0001839219887663


Devezer, B., Nardin, L. G., Baumgaertner, B., & Buzbas, E. O. (2019). Scientific discovery in a model-centric framework: Reproducibility, innovation, and epistemic diversity. PLOS ONE, 14(5), Article e0216125. https://doi.org/10.1371/journal.pone.0216125


Drummond, C. (2019). Is the drive for reproducible science having a detrimental effect on what is published? Learned Publishing, 32(1), 63-69. https://doi.org/10.1002/leap.1224


Penders, B., Holbrook, J. B., & de Rijcke, S. (2019). Rinse and repeat: Understanding the value of replication across different ways of knowing. Publications, 7(3), 52. https://doi.org/10.3390/publications7030052 


Wiggins, B. J., & Christopherson, C. D. (2019). The replication crisis in psychology: An overview for theoretical and philosophical psychology. Journal of Theoretical and Philosophical Psychology, 39(4), 202–217. https://doi.org/10.1037/teo0000137


Leonelli, S. (2018). Rethinking reproducibility as a criterion for research quality. In Including a symposium on Mary Morgan: Curiosity, imagination, and surprise (Research in the History of Economic Thought and Methodology, Vol. 36B, pp. 129-146). Emerald Publishing. https://doi.org/10.1108/S0743-41542018000036B009


Levin, N., & Leonelli, S. (2017). How does one “open” science? Questions of value in biological research. Science, Technology, & Human Values, 42(2), 280-305. https://doi.org/10.1177/0162243916672071

Theory & Theory Development

Dames, H., Musfeld, P., Popov, V., Oberauer, K., & Frischkorn, G. T. (2024). Responsible research assessment should prioritize theory development and testing over ticking open science boxes. Meta-Psychology. https://doi.org/10.15626/MP.2023.3735 

Guest, O. (2024). What makes a good theory, and how do we make a theory good? Computational Brain & Behavior. https://doi.org/10.1007/s42113-023-00193-2 

Field, S. M., Volz, L., Kaznatcheev, A., & van Dongen, N. (2024). Can a good theory be built using bad ingredients? Computational Brain & Behavior. https://doi.org/10.1007/s42113-024-00220-w

Hutmacher, F., & Franz, D. J. (2024). Approaching psychology’s current crises by exploring the vagueness of psychological concepts: Recommendations for advancing the discipline. American Psychologist. https://doi.org/10.1037/amp0001300 

Jost, J. T. (2024). Grand challenge: Social psychology without hubris. Frontiers in Social Psychology, 1, Article 1283272. https://doi.org/10.3389/frsps.2023.1283272 

Devezer, B., & Buzbas, E. O. (2023). Rigorous exploration in a model-centric science via epistemic iteration. Journal of Applied Research in Memory and Cognition, 12(2), 189–194. https://doi.org/10.1037/mac0000121 

Lavelle, J. S. (2023, October 2). Growth from uncertainty: Understanding the replication 'crisis' in infant psychology. PhilSci Archive. https://philsci-archive.pitt.edu/22679/ 

Guest, O., & Martin, A. E. (2021). How computational modeling can force theory building in psychological science. Perspectives on Psychological Science, 16(4), 789–802. https://doi.org/10.1177/1745691620970585 

Proulx, T., & Morey, R. D. (2021). Beyond statistical ritual: Theory in psychological science. Perspectives on Psychological Science, 16(4), 671-681. https://doi.org/10.1177/17456916211017098

Szollosi, A., & Donkin, C. (2021). Arrested theory development: The misguided distinction between exploratory and confirmatory research. Perspectives on Psychological Science, 16(4), 717-724. https://doi.org/10.1177/1745691620966796

Wentzel, K. R. (2021). Open science reforms: Strengths, challenges, and future directions. Educational Psychologist, 56(2), 161-173. https://doi.org/10.1080/00461520.2021.1901709

Oberauer, K., & Lewandowsky, S. (2019). Addressing the theory crisis in psychology. Psychonomic Bulletin & Review, 26(5), 1596-1618. https://doi.org/10.3758/s13423-019-01645-2

Stroebe, W. (2019). What can we learn from many labs replications? Basic and Applied Social Psychology, 41(2), 91-103. https://doi.org/10.1080/01973533.2019.1577736 

van Rooij, I. (2019). Psychological science needs theory development before preregistration. Psychonomic Society. https://featuredcontent.psychonomic.org/psychological-science-needs-theory-development-before-preregistration/ 

Fiedler, K. (2018). The creative cycle and the growth of psychological science. Perspectives on Psychological Science, 13(4), 433-438. https://doi.org/10.1177/1745691617745651 

Schaller, M. (2016). The empirical benefits of conceptual rigor: Systematic articulation of conceptual hypotheses can reduce the risk of non-replicable results (and facilitate novel discoveries too). Journal of Experimental Social Psychology, 66, 107-115. https://doi.org/10.1016/j.jesp.2015.09.006 

Trafimow, D., & Earp, B. D. (2016). Badly specified theories are not responsible for the replication crisis in social psychology: Comment on Klein. Theory & Psychology, 26(4), 540–548. https://doi.org/10.1177/0959354316637136 

Heterogeneity, Context Sensitivity, & Exploratory Research

Holzmeister, F., Johannesson, M., Böhm, R., Dreber, A., Huber, J., & Kirchler, M. (2024). Heterogeneity in effect size estimates. Proceedings of the National Academy of Sciences, 121(32), e2403490121. https://doi.org/10.1073/pnas.2403490121 

Iso-Ahola, S. E. (2024). Science of psychological phenomena and their testing. American Psychologist. https://doi.org/10.1037/amp0001362 

Maziarz, M. (2024). Conflicting results and statistical malleability: Embracing pluralism of empirical results. Perspectives on Science, 32(6), 701-728. https://doi.org/10.1162/posc_a_00627

Phaf, R. H. (2024). Positive deviance underlies successful science: Normative methodologies risk throwing out the baby with the bathwater. Review of General Psychology. https://doi.org/10.1177/10892680241235120

Rubin, M., & Donkin, C. (2024). Exploratory hypothesis tests can be more compelling than confirmatory hypothesis tests. Philosophical Psychology, 37(8), 2019-2047. https://doi.org/10.1080/09515089.2022.2113771 

Devezer, B., & Buzbas, E. O. (2023). Rigorous exploration in a model-centric science via epistemic iteration. Journal of Applied Research in Memory and Cognition, 12(2), 189–194. https://doi.org/10.1037/mac0000121

Schwartz, B. (2023, March 20). Psychology’s increased rigor is good news. But is it only good news? Behavioral Scientist. https://behavioralscientist.org/psychologys-increased-rigor-is-good-news-but-is-it-only-good-news/

Jacobucci, R. (2022). A critique of using the labels confirmatory and exploratory in modern psychological research. Frontiers in Psychology, 13, Article 1020770. https://doi.org/10.3389/fpsyg.2022.1020770 

Bryan, C. J., Tipton, E., & Yeager, D. S. (2021). Behavioural science is unlikely to change the world without a heterogeneity revolution. Nature Human Behaviour, 5(8), 980-989. https://doi.org/10.1038/s41562-021-01143-3

Finkel, E. J., Eastwick, P. W., & Reis, H. T. (2017). Replicability and other features of a high-quality science: Toward a balanced and empirical approach. Journal of Personality and Social Psychology, 113(2), 244-253. http://dx.doi.org/10.1037/pspi0000075

Greenfield, P. M. (2017). Cultural change over time: Why replicability should not be the gold standard in psychological science. Perspectives on Psychological Science, 12(5), 762-771. https://doi.org/10.1177/1745691617707314

Fiedler, K., Kutzner, F., & Krueger, J. I. (2012). The long way from α-error control to validity proper: Problems with a short-sighted false-positive debate. Perspectives on Psychological Science, 7(6), 661-669. https://doi.org/10.1177/1745691612462587

Researcher Equity, Diversity, & Inclusion

Hostler, T. J. (2023). The invisible workload of open research. Journal of Trial & Error. https://doi.org/10.36850/mr5

Fox Tree, J., Lleras, A., Thomas, A., & Watson, D. (2022, August 30). The inequitable burden of open science. Psychonomic Society Featured Content. https://featuredcontent.psychonomic.org/the-inequitable-burden-of-open-science/

Kessler, A., Likely, R., & Rosenberg, J. M. (2021). Open for whom? The need to define open science for science education. Journal of Research in Science Teaching, 58(10), 1590-1595. https://doi.org/10.1002/tea.21730

Bahlai, C., Bartlett, L. J., Burgio, K. R., Fournier, A. M., Keiser, C. N., Poisot, T., & Whitney, K. S. (2019). Open science isn’t always open to all scientists. American Scientist, 107(2), 78-82. http://dx.doi.org/10.1511/2019.107.2.78

Bropenscience & Tone

Darda, K. M., Conry-Murray, C., Schmidt, K., Elsherif, M. M., Peverill, M., Yoneda, T., … Gernsbacher, M. (2023, October 29). Promoting civility in formal and informal open science contexts. PsyArXiv. https://doi.org/10.31234/osf.io/rfkyu 

Derksen, M., & Field, S. (2022). The tone debate: Knowledge, self, and social order. Review of General Psychology, 26(2), 172-183. https://doi.org/10.1177/10892680211015636

Pownall, M., & Hoerst, C. (2022). Slow science in scholarly critique. The Psychologist, 35, 2. https://thepsychologist.bps.org.uk/volume-35/february-2022/slow-science-scholarly-critique

Anonymous. (2021, November 25). It’s 2021… and we are still dealing with misogyny in the name of open science. University of Sussex School of Psychology Blog. https://blogs.sussex.ac.uk/psychology/2021/11/25/its-2021-and-we-are-still-dealing-with-misogyny-in-the-name-of-open-science/

Whitaker, K., & Guest, O. (2020). #bropenscience is broken science. The Psychologist, 33, 34-37. https://thepsychologist.bps.org.uk/volume-33/november-2020/bropenscience-broken-science

Hamlin, J. K. (2017). Is psychology moving in the right direction? An analysis of the evidentiary value movement. Perspectives on Psychological Science, 12(4), 690-693. https://doi.org/10.1177/1745691616689062

Fiske, S. T. (2016, October 31). A call to change science’s culture of shaming. APS Observer, 29. https://www.psychologicalscience.org/observer/a-call-to-change-sciences-culture-of-shaming

The Credibility of Metascientific Research

Auspurg, K., & Brüderl, J. (2024). Toward a more credible assessment of the credibility of science by many-analyst studies. Proceedings of the National Academy of Sciences, 121(38), e2404035121. https://doi.org/10.1073/pnas.2404035121 

Bak-Coleman, J. B., & Devezer, B. (2024). Claims about scientific rigour require rigour. Nature Human Behavior. https://doi.org/10.1038/s41562-024-01982-w

Dudda, L., Kormann, E., Kozula, M., DeVito, N. J., Klebel, T., Dewi, A. P. M., … Leeflang, M. (2024, June 17). Open science interventions to improve reproducibility and replicability of research: A scoping review preprint. MetaArXiv. https://doi.org/10.31222/osf.io/a8rmu 

Penders, B. (2024, November 26). Renovating the theatre of persuasion. ManyLabs as collaborative prototypes for the production of credible knowledge. PsyArXiv. https://doi.org/10.31222/osf.io/vhmk2

Dal Santo, T., Rice, D. B., Amiri, L. S., Tasleem, A., Li, K., Boruff, J. T., Geoffroy, M.-C., Benedetti, A., & Thombs, B. D. (2023). Methods and results of studies on reporting guideline adherence are poorly reported: A meta-research study. Journal of Clinical Epidemiology. https://doi.org/10.1016/j.jclinepi.2023.05.017

Schimmelpfennig, R., Spicer, R., White, C., Gervais, W. M., Norenzayan, A., Heine, S., … Muthukrishna, M. (2023, February 9). A problem in theory and more: Measuring the moderating role of culture in Many Labs 2. PsyArXiv. https://psyarxiv.com/hmnrx/

Baumeister, R. F., Tice, D. M., & Bushman, B. J. (2022). A review of multisite replication projects in social psychology: Is it viable to sustain any confidence in social psychology’s knowledge base? Perspectives on Psychological Science, 18(4), 912-935. https://doi.org/10.1177/17456916221121815 

Lohmann, A., Astivia, O. L., Morris, T. P., & Groenwold, R. H. (2022). It's time! Ten reasons to start replicating simulation studies. Frontiers in Epidemiology, 2, 973470. https://doi.org/10.3389/fepid.2022.973470

Devezer, B., Navarro, D. J., Vandekerckhove, J., & Buzbas, E. O. (2021). The case for formal methodology in scientific reform. Royal Society Open Science, 8(3), Article 200805. https://doi.org/10.1098/rsos.200805

Bryan, C. J., Yeager, D. S., & O’Brien, J. M. (2019). Replicator degrees of freedom allow publication of misleading failures to replicate. Proceedings of the National Academy of Sciences, 116(51), 25535-25545. https://doi.org/10.1073/pnas.1910951116

Gilbert, D. T., King, G., Pettigrew, S., & Wilson, T. D. (2016). Comment on “Estimating the reproducibility of psychological science”. Science, 351(6277), 1037-1037. https://doi.org/10.1126/science.aad7243

Gilbert, D. T., King, G., Pettigrew, S., & Wilson, T. D. (2016). More on “Estimating the reproducibility of psychological science”. https://gking.harvard.edu/files/gking/files/gkpw_post_publication_response.pdf

The Science Reform Movement

Hostler, T. J. (2024). Open research reforms and the capitalist university: Areas of opposition and alignment. Collabra: Psychology, 10(1), 121383. https://doi.org/10.1525/collabra.121383 

Penders, B. (2024). Scandal in scientific reform: The breaking and remaking of science. Journal of Responsible Innovation, 11(1). https://doi.org/10.1080/23299460.2024.2371172 

Devezer, B., & Penders, B. (2023). Scientific reform, citation politics and the bureaucracy of oblivion. Quantitative Science Studies. https://doi.org/10.1162/qss_c_00274 

Peterson, D., & Panofsky, A. (2023). Metascience as a scientific social movement. Minerva. https://doi.org/10.1007/s11024-023-09490-3

Rubin, M. (2023). Questionable metascience practices. Journal of Trial & Error, 4(1), 5–20. https://doi.org/10.36850/mr4

Flis, I. (2022). The function of literature in psychological science. Review of General Psychology, 26(2), 146-156. https://doi.org/10.1177/10892680211066466

Penders, B. (2022). Process and bureaucracy: Scientific reform as civilisation. Bulletin of Science, Technology & Society, 42(4), 107-116. https://doi.org/10.1177/02704676221126388 

Morawski, J. (2022). How to true psychology’s objects. Review of General Psychology, 26(2), 157-171. https://doi.org/10.1177/10892680211046518

Bastian, H. (2021, October 31). The metascience movement needs to be more self-critical. PLOS Blogs: Absolutely Maybe. https://absolutelymaybe.plos.org/2021/10/31/the-metascience-movement-needs-to-be-more-self-critical/

Gervais, W. M. (2021). Practical methodological reform needs good theory. Perspectives on Psychological Science, 16(4), 827-843. https://doi.org/10.1177/1745691620977471

Peterson, D., & Panofsky, A. (2021). Arguments against efficiency in science. Social Science Information, 60(3), 350-355. https://doi.org/10.1177/05390184211021383

Andreoletti, M. (2020). Replicability crisis and scientific reforms: Overlooked issues and unmet challenges. International Studies in the Philosophy of Science, 33(3), 135-151. https://doi.org/10.1080/02698595.2021.1943292

Derksen, M. (2019). Putting Popper to work. Theory & Psychology, 29(4), 449-465. https://doi.org/10.1177/0959354319838343

Flis, I. (2019). Psychologists psychologizing scientific psychology: An epistemological reading of the replication crisis. Theory & Psychology, 29(2), 158-181. https://doi.org/10.1177/0959354319835322

Mirowski, P. (2018). The future(s) of open science. Social Studies of Science, 48(2), 171-203. https://doi.org/10.1177/0306312718772086

Guest, O. (2016). Crisis in what exactly? The Winnower. https://doi.org/10.15200/winn.146590.01538

Further Information

Reference for this Resource

Rubin, M. (2024). Critical metascience articles. https://sites.google.com/site/markrubinsocialpsychresearch/replication-crisis/list-of-articles-critical-of-open-science

Associated Research Rabbit Collection

https://www.researchrabbitapp.com/collection/public/V467G42MLX 

Critical Metascience Blog