Exploit some a priori structure to learn/recover from few data points - this encompasses Compressive Sensing, where the specific structure is the sparsity of a vector
Handle other structures such as low rank for matrices and tensors, variable reduction for multivariate functions, etc.
Active/directed learning to address the choice of the few data sites
In particular, uncover near-optimal choices via deterministic (not random) means - this constitutes a major mathematical challenge
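The vector-sparsity setting above can be illustrated with a minimal sketch (not any specific algorithm from the publications below): recovering a sparse vector from few random linear measurements by solving the LASSO objective with iterative soft-thresholding (ISTA). All dimensions, the regularization parameter, and the iteration count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic compressive sensing setup: recover an s-sparse vector x_true
# from m < n random Gaussian measurements y = A @ x_true.
n, m, s = 200, 60, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.standard_normal(s)
y = A @ x_true

# ISTA (iterative soft-thresholding) for the LASSO objective
#   min_x  0.5 * ||A x - y||_2^2 + lam * ||x||_1
lam = 0.01                               # illustrative regularization weight
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(5000):
    grad = A.T @ (A @ x - y)
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(rel_err)
```

With Gaussian measurements and m comfortably larger than s, the relative error is typically small, consistent with standard compressive sensing guarantees.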
TAMU T3 Triads: Learning More Efficiently with Less Labels, 2020-22 (PI: Foucart, co-PI: Tuo)
S. Foucart. The sparsity of LASSO-type minimizers. (arXiv)
S. Foucart, E. Tadmor, M. Zhong. On the sparsity of LASSO minimizers in sparse data recovery. Constructive Approximation, to appear. (arXiv)
V. K. Amalladinne, J. R. Ebert, J. F. Chamberland, K. Narayanan. An enhanced decoding algorithm for coded compressed sensing with applications to unsourced random access. Sensors, 22/2, 676, 2022. (doi)
S. Foucart, D. Needell, R. Pathak, Y. Plan, M. Wootters. Weighted matrix completion from non-random, non-uniform sampling patterns. IEEE Transactions on Information Theory, 67/2, 1264-1290, 2021. (doi)
S. Foucart. Facilitating OWL norm minimizations. Optimization Letters, 15/1, 263-269, 2021. (doi)