Pilot Project 2

Neural Network Approximation

Team: R. DeVore, K. Narayanan, G. Petrova, J. Siegel, S. Wojtowytsch (lead)

Synopsis

  • Understand the set of problems in data science and scientific computing where neural networks outperform other methods of function approximation

  • Identify model classes for which approximation properties can be shown for various types of neural networks (ResNet, DenseNet, etc.)

  • Characterize the functions in a given approximation class and assess how well these classes represent realistic functions

  • Realize the approximation methods algorithmically and rigorously analyze approaches such as stochastic gradient descent and greedy methods (a minimal sketch of the latter follows this list)
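
As an illustration of the greedy approaches mentioned in the last point, the following NumPy sketch runs the orthogonal greedy algorithm over a finite dictionary of ReLU ridge functions to approximate a univariate target function, in the spirit of the Siegel-Xu paper on the orthogonal greedy algorithm listed below. The target function, dictionary size, and discretization are illustrative assumptions, not part of the project's codebase.

```python
# Minimal sketch (illustrative, not the project's code): orthogonal greedy
# algorithm (OGA) for approximating a 1D target function by a shallow ReLU
# network drawn from a finite random dictionary.
import numpy as np

rng = np.random.default_rng(0)

# Discretize [0, 1] and pick an assumed target function to approximate.
x = np.linspace(0.0, 1.0, 512)
f = np.sin(4.0 * np.pi * x) + 0.5 * np.abs(x - 0.3)

# Finite dictionary of normalized ReLU ridge functions t -> relu(w*t + b),
# with (w, b) sampled at random (an assumption made for this sketch).
ws = rng.uniform(-4.0, 4.0, size=400)
bs = rng.uniform(-4.0, 4.0, size=400)
D = np.maximum(ws[None, :] * x[:, None] + bs[None, :], 0.0)
D = D / np.maximum(np.linalg.norm(D, axis=0), 1e-12)

def oga(f, D, n_terms=20):
    """Orthogonal greedy algorithm: at each step select the dictionary
    column most correlated with the residual, then re-project f onto the
    span of all columns selected so far."""
    selected, residual = [], f.copy()
    for _ in range(n_terms):
        k = int(np.argmax(np.abs(D.T @ residual)))    # best-matching atom
        selected.append(k)
        A = D[:, selected]                            # atoms chosen so far
        coef, *_ = np.linalg.lstsq(A, f, rcond=None)  # orthogonal projection
        residual = f - A @ coef
    return selected, coef, residual

selected, coef, residual = oga(f, D, n_terms=20)
print(f"relative L2 error after 20 terms: "
      f"{np.linalg.norm(residual) / np.linalg.norm(f):.3e}")
```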

Related grants

  • NSF-CCF CIF: Small: Interpretable Machine Learning based on Deep Neural Networks: a Source Coding Perspective, 2022-2025 (local co-PI: Siegel)

  • NSF SCALE-MoDL: New Perspectives on Deep Learning: Bridging Approximation, Statistical, and Algorithmic Theories, 2021-2024 (local PI: Petrova, local co-PI: DeVore)

  • ONR MURI: Theoretical Foundations of Deep Learning, 2020-2023 (local PI: DeVore, local co-PIs: Foucart, Petrova)

Recent relevant papers

  • S. Wojtowytsch, J. Park. Qualitative neural network approximation over R and C: Elementary proofs for analytic and polynomial activation. (arXiv)

  • J. W. Siegel, J. Xu. Sharp bounds on the approximation rates, metric entropy, and n-widths of shallow neural networks. (arXiv)

  • I. Daubechies, R. DeVore, N. Dym, S. Faigenbaum-Golovin, S. Kovalsky, K.-C. Lin, J. Park, G. Petrova, B. Sober. Neural network approximation of refinable functions. IEEE Transactions on Information Theory, to appear. (arXiv)

  • J. W. Siegel, J. Xu. High-order approximation rates for shallow neural networks with cosine and ReLUk activation functions. Applied and Computational Harmonic Analysis, 58, 1-26, 2022. (doi)

  • J. W. Siegel, J. Xu. Optimal convergence rates for the orthogonal greedy algorithm. IEEE Transactions on Information Theory, 68(5), 3354-3361, 2022. (doi)

  • A. Cohen, R. DeVore, G. Petrova, P. Wojtaszczyk. Optimal stable nonlinear approximation. Foundations of Computational Mathematics, 22, 607-648, 2022. (doi)

  • I. Daubechies, R. DeVore, S. Foucart, B. Hanin, G. Petrova. Nonlinear approximation and (deep) ReLU networks. Constructive Approximation, 55, 127-172, 2022. (doi)

  • R. DeVore, B. Hanin, G. Petrova. Neural network approximation. Acta Numerica, 30, 327-444, 2021. (doi)