Sampling inequalities and optimal recovery in reproducing kernel Hilbert spaces
Sparse representation of covariance matrices
Approximation theory for Gaussian process models with non-Gaussian distributed observations
NSF CDS&E-MSS: Sparsely Activated Bayesian Neural Networks from Deep Gaussian Processes, 2023-26 (PI: Rui Tuo)
L. Ding, R. Tuo. A general theory for kernel packets: from state space model to compactly supported basis. (arXiv)
R. Tuo, L. Zou. Asymptotic theory for linear functionals of kernel ridge regression. (arXiv)
A. Prakash, R. Tuo, Y. Ding. Gaussian process aided function comparison using noisy scattered data. Technometrics, 64/1, 92-102, 2022. (doi)
H. Chen, L. Ding, R. Tuo. Kernel packet: an exact and scalable algorithm for Gaussian process regression with Matérn correlations. Journal of Machine Learning Research, 23/127, 1-32, 2022. (link)
R. Tuo, S. He, A. Pourhabib, Y. Ding, J. Z. Huang. A reproducing kernel Hilbert space approach to functional calibration of computer models. Journal of the American Statistical Association, 2021. (doi)
R. Tuo, W. Wang. Uncertainty quantification for Bayesian optimization. International Conference on Artificial Intelligence and Statistics, Proceedings of Machine Learning Research, 151, 2862-2884, 2022. (link)
A. Amir, D. Levin, F. J. Narcowich, J. D. Ward. Meshfree extrapolation with application to enhanced near-boundary approximation with local Lagrange kernels. Foundations of Computational Mathematics, 22, 1-34, 2022. (doi)