Research Thrust 1

Brain Connectivity Modeling using Neuroimaging Data: The human brain is the most complex system known. How the brain efficiently and robustly supports neural computation and information flow is still an open question. Higher functions (language, conscious recognition, etc.) are neither the result of activity strictly localized in specific neural structures, nor of the brain acting as a whole, but emerge from the joint dynamics of distributed cortical regions, each relatively specialized for one or more aspects of the function. It is at the systems level that anatomy, physiology, and adaptive function come into correspondence. The composition of such systems is fundamentally constrained by the patterns of anatomical connectivity linking different cortical centers, but the systems' architecture shifts dynamically, so that individual cortical regions participate in multiple distributed systems. Meanwhile, modern neuroimaging techniques have enabled high-throughput measurement and characterization of the function and structure of the brain. Commonly used neuroimaging modalities include Magnetic Resonance Imaging (MRI), Positron Emission Tomography (PET), Diffusion Tensor Imaging (DTI), and functional MRI (fMRI). Neuroimaging has proven to be a powerful method with enormous implications for both scientific discovery and clinical applications, such as understanding how brain structure supports cognitive functions, identifying brain regions disrupted by neurodegenerative diseases, understanding how disease processes such as Alzheimer's Disease disrupt those functions, monitoring disease progression, and evaluating treatment effects with an index more sensitive and reliable than conventional subjective cognitive measurements.
We are interested in the modeling and analysis of brain systems using multi-modality neuroimaging data, focusing on system-level understanding and on innovations in machine learning and statistical analysis, to create an analytic framework that can convert high-dimensional and noisy neuroimaging data into scientific knowledge and better clinical practice.
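As a concrete illustration of the kind of analysis involved, the sketch below estimates a functional connectivity matrix from region-level fMRI time series using Pearson correlation. This is a minimal, generic example, not our specific method; the random "BOLD signals", the region count, and the 0.3 threshold are all placeholder assumptions for illustration.

```python
import numpy as np

# Minimal sketch of functional connectivity estimation (illustrative only).
# Assumption: time series have already been extracted for each brain region,
# e.g., by averaging voxel signals within an atlas parcellation.

rng = np.random.default_rng(0)
n_timepoints, n_regions = 200, 5                     # toy dimensions
ts = rng.standard_normal((n_timepoints, n_regions))  # placeholder BOLD signals

# Functional connectivity: region-by-region Pearson correlation matrix.
fc = np.corrcoef(ts, rowvar=False)                   # shape (n_regions, n_regions)

# Threshold weak correlations to obtain a sparse network
# (the 0.3 cutoff is arbitrary, chosen only for illustration).
adjacency = (np.abs(fc) > 0.3) & ~np.eye(n_regions, dtype=bool)
print(fc.shape, int(adjacency.sum()))
```

In practice the correlation step is often followed by graph-theoretic analysis of the thresholded network, which is where the system-level questions above come into play.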

Collaborators

Integrated Brain Imaging Center at UW: http://www.ibic.washington.edu,

Banner Alzheimer's Institute: http://www.banneralz.org/

Byrd Alzheimer's Institute: http://health.usf.edu/byrd/

Towards A Mechanistic Understanding of Type 1 Diabetes: The rapidly increasing incidence of Type 1 Diabetes (T1D) over the past decade has made it a global health issue. The well-known natural history model proposed by Dr. George Eisenbarth highlights the potential for understanding the etiology of T1D and improving its prediction. The model calls for greater focus on the factors involved in disease progression, such as genetic background, early-life environmental exposures, and immunological markers, and highlights a promising window of opportunity for preventing T1D from progressing from early stages to disease onset. However, there is still a lack of understanding of the pronounced inter-individual variation in the subclinical prodrome. To attain a better understanding of T1D, a number of large-scale T1D studies, such as The Environmental Determinants of Diabetes in the Young (TEDDY) study, have been launched to thoroughly collect longitudinal repeated measurements (from genotyping and environmental exposures to immunologic and metabolomic measurements) from a large number of subjects. Although Eisenbarth's model has served as the crucial framework for these studies and spurred many hypothesis-driven statistical analyses, the pace of translating such "big" T1D data into clinical prognostics has lagged. Given the complexity of the T1D process, the longitudinal data collected in TEDDY are essentially naturalistic observations of a dynamic system, rather than experimental measurements taken from a well-defined homogeneous population. The success of many existing statistical models, such as the Cox proportional hazards model and its extensions with time-varying covariates, builds on the homogeneity assumption, under which a central tendency can effectively establish the population-level characteristics and covariates are sufficient to characterize individual variation as deviation from that center.
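The homogeneity assumption discussed above can be made concrete with the Cox partial likelihood, in which a single coefficient vector is shared by the entire population and covariates only shift each subject's hazard relative to a common baseline. The sketch below is a textbook implementation for right-censored data, not any model from the TEDDY analyses; the toy data are invented for illustration.

```python
import numpy as np

# Minimal sketch of the Cox proportional hazards partial likelihood.
# One shared `beta` for all subjects embodies the homogeneity assumption.

def neg_log_partial_likelihood(beta, times, events, X):
    """Negative log partial likelihood for right-censored survival data."""
    beta = np.asarray(beta, dtype=float)
    risk_scores = X @ beta                      # x_i' beta for each subject
    nll = 0.0
    for i in np.where(events == 1)[0]:          # sum over observed events only
        at_risk = times >= times[i]             # risk set at event time T_i
        nll -= risk_scores[i] - np.log(np.exp(risk_scores[at_risk]).sum())
    return nll

# Toy data (hypothetical): 4 subjects, 1 covariate, two observed events.
times = np.array([2.0, 3.0, 5.0, 7.0])
events = np.array([1, 0, 1, 0])
X = np.array([[0.5], [1.0], [-0.3], [0.2]])
print(neg_log_partial_likelihood([0.1], times, events, X))
```

Minimizing this quantity over `beta` (e.g., with a standard optimizer) recovers the usual Cox estimates; the point here is simply that nothing in the likelihood allows subject-specific dynamics beyond the covariate effects, which is exactly what limits its fit to heterogeneous naturalistic data.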
To systematically analyze the big TEDDY data, we are developing new analysis approaches capable of integrating the temporal changes of critical markers and the interactions of potential risk factors, handling mixed types of measurements, and characterizing the disease dynamics, all of which are critical components of understanding a dynamic system.

The figure above illustrates the conceptual framework we are taking to model T1D via a systematic approach.

Collaborators

Pacific Northwest Diabetes Research Institute: http://www.pnri.org/,

TEDDY Study Group: http://teddy.epi.usf.edu/,

Benaroya Research Institute: https://www.benaroyaresearch.org/