ctGATE: Clinical Trial Generalizability Assessment Informatics Toolbox

Project Summary

Clinical studies are often conducted under idealized and rigorously controlled conditions to improve their internal validity and success rates, but these conditions compromise their external validity (i.e., generalizability to the target populations). The idealized conditions are sometimes exaggerated and reflected in overly restrictive eligibility criteria. Certain population subgroups are frequently excluded by unjustified criteria and are consequently underrepresented; older adults, in particular, have been underrepresented in cancer studies. When interventions are moved into clinical practice, this underrepresentation reduces treatment effects and increases the likelihood of adverse outcomes in diverse populations. It is therefore imperative to rigorously assess the generalizability of a clinical study so that stakeholders, including pharmaceutical companies, policymakers, providers, and patients, can understand and anticipate the likely effects of the interventions in the real world. Over the past two decades, a large number of studies have assessed generalizability, but most were conducted after the fact, were ad hoc and unsystematic, and focused on specific diseases and sets of trials without a formalized approach. A significant knowledge gap remains between the available methods for generalizability assessment and their adoption in research practice, and most assessments have been conducted as ad hoc auditing efforts by third parties after the fact. We believe the key barriers are two-fold: (1) the lack of evidence demonstrating the validity of these methods, which in turn leads to a lack of consensus on best practices for generalizability assessment; and (2) the lack of readily available, well-vetted statistical and informatics tools. Motivated to fill this gap, we propose to first systematically review the extant methods for generalizability assessment, and then use a data-driven strategy to reproduce, evaluate, and compare these methods with our unique data resource, the OneFlorida Data, one of the 13 PCORI-funded Clinical Data Research Networks, which contains linked EHRs, claims, and cancer registry data for ~15 million Floridians. We will develop an open-source generalizability assessment software toolbox along with its accompanying documentation and tutorials. The success of this R21 project will (1) fill a knowledge gap on the validity and utility of the different generalizability assessment methods; (2) provide ctGATE, an easy-to-use toolbox for assessing study generalizability that is much needed by the clinical research community; (3) help clinical researchers choose the most appropriate generalizability assessment methods with readily available implementations; and (4) build a body of evidence to support the development of an eligibility criteria design tool for optimizing study generalizability at the study design phase.

Clinical Trial Generalizability Assessment Toolbox (ctGATE) (https://ctgate.cci.fsu.edu/)

In the ctGATE tool, the user can search relevant clinical trial generalizability assessment papers by data source, disease category, type of generalizability assessment method (score vs. non-score output, a priori vs. a posteriori assessment), PMID, and title. The filtered papers are then displayed in a table, where the user can (1) click on the PMID to view the article's entry in PubMed; (2) click on the paper's title to view all of the coded information about the study; and (3) view the R/Python tutorial for the generalizability assessment method.
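As an illustration of the kind of method these tutorials cover, the sketch below shows a simple coverage-style a priori assessment in Python: the proportion of a real-world target population (e.g., EHR-derived) that satisfies a trial's eligibility criteria. This is a minimal, hypothetical example, not ctGATE's implementation; the column names (age, ecog, creatinine) and the thresholds are assumptions chosen for illustration.

# Minimal sketch of a coverage-style a priori generalizability check:
# what fraction of a real-world target population would be eligible for the trial?
# All column names and eligibility thresholds below are hypothetical.
import pandas as pd

def eligible(patients: pd.DataFrame) -> pd.Series:
    """Boolean mask of patients meeting the hypothetical eligibility criteria."""
    return (
        patients["age"].between(18, 75)          # age range criterion
        & (patients["ecog"] <= 1)                # performance status criterion
        & (patients["creatinine"] <= 1.5)        # lab value criterion
    )

def coverage(patients: pd.DataFrame) -> float:
    """Proportion of the target population that would be trial-eligible."""
    return float(eligible(patients).mean())

if __name__ == "__main__":
    # Toy stand-in for an EHR-derived target population.
    patients = pd.DataFrame({
        "age": [34, 68, 79, 55, 82],
        "ecog": [0, 1, 2, 1, 0],
        "creatinine": [0.9, 1.2, 1.1, 1.8, 1.0],
    })
    print(f"Population coverage: {coverage(patients):.0%}")  # 2 of 5 patients, i.e., 40%

This coverage-style proportion is only one example; the score-based and a posteriori methods indexed in ctGATE draw on the same kinds of inputs (eligibility criteria and real-world population data) but summarize generalizability differently, and the per-paper tutorials provide the method-specific implementations.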

How did we build this tool?

Please see this abstract for details.

Citation

He Z, Tang X, Yang X, Guo Y, George TJ, Charness N, Quan Hem KB, Hogan W, Bian J. Clinical Trial Generalizability Assessment in the Big Data Era: A Review. Clinical and Translational Science. 2020 Feb 14. https://doi.org/10.1111/cts.12764

Funding Support

This project is supported by the National Institute on Aging of the U.S. National Institutes of Health under the grant "Systematic Analysis of Clinical Study Generalizability Assessment Methods with Informatics" (R21AG061431).