Hoces de la Guardia, F., Miguel, E., Pathan, G., Silva da Rocha, V., Shaji, A., Sørensen, E., Tungodden, B.
Abstract.
What scientists choose not to publish can be as important as what they do publish. This study introduces a novel measure of the reporting of research results in economics and investigates the causes of under-reporting. We encode over 5,000 pre-registered hypotheses from 318 studies in the American Economic Association (AEA) RCT Registry and first show that only 42% of hypotheses have publicly available results 8 to 9 years after pre-registration. Around two-thirds of these publicly available findings are null results (i.e., not statistically significant), contrary to widespread concerns about the under-reporting of null results. We then carried out an RCT to test three plausible barriers to full reporting: a lack of awareness, engagement, or resources. An intervention in which our team engaged with authors' research and their under-reported findings led them to provide new estimates for 10% of the missing pre-registered hypotheses. Authors also provided explanations for an additional 16% of the missing hypotheses (cases where interventions or data collection never occurred). In contrast, neither a light-touch "awareness" intervention nor a more expensive intervention providing research assistant resources led to additional reporting. Together, the findings indicate that the majority of pre-registered research hypotheses in economics are currently not reported to the community; that under-reporting is more complex than a simple reluctance to report nulls; and that relatively low-cost engagement with authors can considerably boost the reporting of "missing" findings. We conclude that stronger incentives and infrastructure are still needed to ensure that pre-registered scientific plans translate into publicly accessible evidence.
"Assessing Reproducibility in Economics Using Standardized Crowd-sourced Analysis"; Abel Brodeur ⓡ Seung Yong Sung ⓡ Edward Miguel ⓡ Lars Vilhuber ⓡ Fernando Hoces de la Guardia. https://www.nber.org/papers/w33753
“The Reproducibility and Robustness of Economics and Political Science”; Brodeur, A., Mikola, D., Cook, N., Fiala, L., ..., Hoces de la Guardia, F., ...; 2025 (Nature, revise & resubmit).
“Promoting reproducibility and replicability in political science”; Brodeur, A., Esterling, K., Ankel-Peters, J., Bueno, N., Desposato, S., Dreber, A., Genovese, F., Green, D., Hepplewhite, M., Hoces de la Guardia, F., et al.; Research & Politics, 2024.
“Reproduction and replication at scale”; Brodeur, A., Dreber A., Hoces de la Guardia, F., Miguel, E.; Nature Human Behaviour, Correspondence, 2024.
“Replication games: how to make reproducibility research more systematic”; Brodeur, A., Dreber A., Hoces de la Guardia, F., Miguel, E.; Nature, Comment, 2023.
“A Framework for Open Policy Analysis”; F. Hoces de la Guardia, S. Grant, E. Miguel; Science and Public Policy, 48(2), 154-163; 2021. tinyurl.com/1qypbihb.
“A consensus-based transparency checklist”; Aczel, B., Szaszi, B., Sarafoglou, A., Kekecs, Z., Kucharský, Š., Benjamin, D., ..., F. Hoces de la Guardia, ..., & Ioannidis, J. P.; Nature Human Behaviour; 2019; 1-3.
During & Pre-PhD:
“Loss Function-based Evaluation of Physician Report Cards”; F. Hoces de la Guardia, J. Hwang, JL Adams, SM. Paddock. Health Services and Outcomes Research Methodology; 2018.
“Optimizing Variance-Bias Trade-off in the TWANG Package for Estimation of Propensity Scores”; L. Parast, D. McCaffrey, L. Burgette, F. Hoces de la Guardia, D. Golinelli, JNV Miles, BA Griffin; Health Services and Outcomes Research Methodology; 2017.
“Better-than-average and worse-than-average hospitals may not significantly differ from average hospitals: an analysis of Medicare Hospital Compare ratings”; SM Paddock, JL Adams, F. Hoces de la Guardia; BMJ quality & safety; 2015.
“Evaluating the Chile Solidario Program: Results Using the Chile Solidario Panel and the Administrative Databases”; F. Hoces de la Guardia, A. Hojman, O. Larranaga; Revista Estudios de Economia; 2011.