Pilot Project 1
Contemporary Optimal Recovery
Team: S. Foucart (lead), I. Gaynanova, G. Paouris, N. Veldt
Synopsis
Revisit a venerable Approximation Theory topic through modern Data Science themes such as computability, data variety, and uncertainty
Reinforce algorithm trustworthiness by emphasizing worst-case over average-case scenarios, e.g. via principled hyperparameter selection
Compare the resulting analytical learning theory to the more popular statistical learning theory
The removal of statistical assumptions is balanced by the need for a priori information (i.e., a model)
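As a concrete illustration of the Hilbert-space setting studied in the papers below, here is a minimal sketch (not code from the project's repository) of a classical fact of Optimal Recovery: for the model set {f : ||f||_H ≤ ε} in a reproducing-kernel Hilbert space, the minimum-norm interpolant of the point values is a worst-case optimal recovery map. The Gaussian kernel and the parameter gamma below are illustrative choices, not ones prescribed by the project.

```python
import numpy as np

# Illustrative sketch: worst-case optimal recovery from point values in an RKHS.
# Model assumption: f lies in a ball {f : ||f||_H <= eps} of the RKHS with
# Gaussian kernel K(s, t) = exp(-gamma * (s - t)^2). Under this model, the
# minimum-norm interpolant of the data is an optimal recovery map.

def gaussian_kernel(s, t, gamma=10.0):
    """Gaussian reproducing kernel evaluated on all pairs from s and t."""
    return np.exp(-gamma * np.subtract.outer(s, t) ** 2)

def optimal_recovery_map(x_obs, y_obs, x_new, gamma=10.0):
    """Evaluate the minimum-norm interpolant of (x_obs, y_obs) at x_new."""
    K = gaussian_kernel(x_obs, x_obs, gamma)   # Gram matrix of the data sites
    coeffs = np.linalg.solve(K, y_obs)         # representer-theorem coefficients
    return gaussian_kernel(x_new, x_obs, gamma) @ coeffs

# Example: recover values of an unknown function from five point values
x_obs = np.linspace(0.0, 1.0, 5)
y_obs = np.sin(2 * np.pi * x_obs)
x_new = np.array([0.1, 0.6])
print(optimal_recovery_map(x_obs, y_obs, x_new))
```

By construction, the recovery map reproduces the observed data exactly; no statistical assumption on how the sites x_obs were generated is needed, only the model-set membership of f.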
Related grants
NSF CDS&E-MSS: Optimal Recovery in the Age of Data Science, 2021–2024 (PI: S. Foucart)
Recent relevant papers
S. Foucart. Full recovery from point values: an optimal algorithm for Chebyshev approximability prior. (arXiv)
S. Foucart, C. Liao. Optimal recovery from inaccurate data in Hilbert spaces: regularize, but what of the parameter? Constructive Approximation, to appear. (arXiv)
S. Foucart, C. Liao, S. Shahrampour, Y. Wang. Learning from non-random data in Hilbert spaces: an optimal recovery perspective. Sampling Theory, Signal Processing, and Data Analysis, 20, 5, 2022. (doi)
M. Ettehad, S. Foucart. Instances of computational optimal recovery: dealing with observation errors. SIAM/ASA Journal on Uncertainty Quantification, 9(4), 1438–1456, 2021. (doi)
S. Foucart. Instances of computational optimal recovery: refined approximability models. Journal of Complexity, 62, 101503, 2021. (doi)
Additional resources
GitHub repository: https://github.com/foucart/COR