Appraisal of: Alexander, R., Peterson, C. J., Yang, S., & Nugent, K. (2025). Article retraction rates in selected MeSH term categories in PubMed published between 2010 and 2020. Accountability in Research, 32(3), 263–276. https://doi.org/10.1080/08989621.2023.2272246
Reviewer(s):
Alan Lovell
Deirdre Beecher
MS Copilot
Full Reference:
Alexander, R., Peterson, C. J., Yang, S., & Nugent, K. (2025). Article retraction rates in selected MeSH term categories in PubMed published between 2010 and 2020. Accountability in Research, 32(3), 263–276. https://doi.org/10.1080/08989621.2023.2272246
Short description:
This study explored article retraction rates across 15 clinical research topics indexed by PubMed MeSH terms between 2010 and 2020. Using PubMed searches, the authors collected data on total publications, retractions, retraction rates per 100,000 articles, and time to retraction. Statistical analyses included Fisher’s exact test, Kruskal–Wallis rank-sum test, and cumulative incidence plots to compare retraction patterns across disciplines.
Results showed significant variation in retraction rates and time to retraction among topics. The median time to retraction was 857 days, with delays exceeding three years in some cases. “Neoplasms” had the highest number of retractions (993), while “Stem Cells” had the highest retraction rate (110.4 per 100,000 publications). Topics such as “Climate Change” and “MMR Vaccine” had very low retraction counts. These findings highlight that research visibility and topic-specific scrutiny influence retraction patterns, with implications for research integrity and editorial practices.
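The per-100,000 normalisation described above can be sketched as follows. This is a minimal illustration of the metric only, using made-up counts; the function name and the example figures are not taken from the study's data.

```python
def rate_per_100k(retractions: int, total_articles: int) -> float:
    """Retraction rate normalised per 100,000 published articles.

    This mirrors the kind of normalisation used to compare topics
    with very different publication volumes.
    """
    if total_articles <= 0:
        raise ValueError("total_articles must be positive")
    return retractions / total_articles * 100_000

# Hypothetical counts, for illustration only (not the study's data):
print(rate_per_100k(25, 50_000))  # → 50.0
```

Normalising by output volume is what allows a smaller field to show a higher rate than a larger one despite fewer absolute retractions, as in the contrast the study draws between "Neoplasms" (most retractions) and "Stem Cells" (highest rate).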
Limitations stated by the author(s):
Delay in article retraction (often 1–3 years or more) means future retractions could alter results.
Analysis was limited to MEDLINE via PubMed; other databases might yield different results.
Use of selected MeSH terms may not represent all research fields; other terms could show different patterns.
Topics were chosen based on perceived public interest and Altmetric data, introducing selection bias.
Did not analyse repeat offenders or rank journals by retraction frequency.
Differences in publication rates and methodologies across disciplines may affect comparability.
Limitations stated by the reviewer(s):
Reliance on PubMed and MeSH terms may underrepresent multidisciplinary or non-indexed research.
The study does not explore reasons for retraction (fraud vs. error), limiting interpretability of integrity issues.
Manual exclusion of duplicates and zero-day retractions introduces potential human error.
Lack of longitudinal follow-up means trends beyond 2020 remain unknown.
Findings may not generalise to non-medical fields or journals with different editorial standards.
Study Type:
Bibliometric and methodological study (observational analysis of publication and retraction patterns)
Tags: