Pilot Project 3
Learning in Data-Poor Conditions
Team: F. Baudier, S. Foucart (lead), K. Narayanan, R. Tuo
Synopsis
Exploit a priori structure to learn/recover from limited data - this includes compressive sensing, where the specific structure is vector sparsity
Handle other structures such as low rank for matrices and tensors, variable reduction for multivariate functions, etc.
Active/directed learning to address the choice of the few data sites
In particular, uncover near-optimal choices via deterministic (rather than random) means - this constitutes a major mathematical challenge
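As a minimal illustration of the compressive-sensing setting mentioned above (a sketch, not code from the project), the snippet below recovers a sparse vector from few random linear measurements using orthogonal matching pursuit, one standard recovery algorithm; the matrix, dimensions, and sparsity level are all illustrative choices.

```python
import numpy as np

def omp(A, y, sparsity):
    """Orthogonal Matching Pursuit: greedily pick the column of A most
    correlated with the residual, then re-fit on the chosen support
    by least squares."""
    m, n = A.shape
    residual = y.copy()
    support = []
    x_hat = np.zeros(n)
    for _ in range(sparsity):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x_hat[support] = coeffs
    return x_hat

rng = np.random.default_rng(0)
m, n, s = 50, 100, 5                  # 50 measurements, ambient dimension 100, 5-sparse
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
x = np.zeros(n)
x[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
y = A @ x                             # the few linear measurements

x_hat = omp(A, y, s)
print(f"recovery error: {np.linalg.norm(x_hat - x):.2e}")
```

Note that the sensing matrix here is random; replacing it with a deterministic construction that still guarantees recovery is exactly the kind of challenge raised in the last point above.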
Related grants
TAMU T3 Triads: Learning More Efficiently with Less Labels, 2020-22 (PI: Foucart, co-PI: Tuo)
Recent relevant papers
S. Foucart. Linearly embedding sparse vectors from ℓ2 to ℓ1 via deterministic dimension-reducing maps. (arXiv)
S. Foucart. The sparsity of LASSO-type minimizers. Applied and Computational Harmonic Analysis, 62, 441-452, 2023. (doi)
S. Foucart, E. Tadmor, M. Zhong. On the sparsity of LASSO minimizers in sparse data recovery. Constructive Approximation, 57, 901-919, 2023. (doi)
V. K. Amalladinne, J. R. Ebert, J. F. Chamberland, K. Narayanan. An enhanced decoding algorithm for coded compressed sensing with applications to unsourced random access. Sensors, 22/2, 676, 2022. (doi)
S. Foucart, D. Needell, R. Pathak, Y. Plan, M. Wootters. Weighted matrix completion from non-random, non-uniform sampling patterns. IEEE Transactions on Information Theory, 67/2, 1264-1290, 2021. (doi)
S. Foucart. Facilitating OWL norm minimizations. Optimization Letters, 15/1, 263-269, 2021. (doi)