Deep Network Approximation

[16] Y. Yang, Y. Wu, H. Yang*, Y. Xiang*. Nearly Optimal Approximation Rates for Deep Super ReLU Networks on Sobolev Spaces. Submitted. [pdf]

[15] Y. Yang, H. Yang*, Y. Xiang. Nearly Optimal VC-Dimension and Pseudo-Dimension Bounds for Deep Neural Network Derivatives. 37th Conference on Neural Information Processing Systems (NeurIPS 2023). [pdf] [doi]

[14] K. Chen^, C. Wang^, H. Yang^*. Deep Operator Learning Lessens the Curse of Dimensionality for PDEs. Transactions on Machine Learning Research, 2023. [pdf] [doi]

[13] Z. Shen^, H. Yang^, S. Zhang^*. Neural Network Architecture Beyond Width and Depth. 36th Conference on Neural Information Processing Systems (NeurIPS 2022). [pdf] [doi]

[12] Z. Shen^, H. Yang^, S. Zhang^*. Deep Network Approximation: Achieving Arbitrary Accuracy with Fixed Number of Neurons. Journal of Machine Learning Research, 2022. [pdf] [doi]

[11] S. Liang^, L. Lyu^, C. Wang^, H. Yang^*. Reproducing Activation Function for Deep Learning. Communications in Mathematical Sciences, 2024. [pdf] [doi]

[10] S. Hon^*, H. Yang^. Simultaneous Neural Network Approximations for Smooth Functions. Neural Networks, 2022. [pdf] [doi]

[9] Z. Shen, H. Yang, S. Zhang. Deep Network Approximation in Terms of Intrinsic Parameters. The 39th International Conference on Machine Learning (ICML 2022), Spotlight. [pdf] [doi]

[8] Z. Shen^*, H. Yang^, S. Zhang^. Optimal Approximation Rate of ReLU Networks in Terms of Width and Depth. Journal de Mathématiques Pures et Appliquées, 2022. [pdf] [doi]

[7] J. Lu^, Z. Shen^, H. Yang^*, S. Zhang^. Deep Network Approximation for Smooth Functions. SIAM Journal on Mathematical Analysis, 2021. [pdf] [doi]

[6] Z. Shen^, H. Yang^*, S. Zhang^. Neural Network Approximation: Three Hidden Layers Are Enough. Neural Networks, 2021. [pdf] [doi]  

[5] Z. Shen^, H. Yang^*, S. Zhang^. Deep Network with Approximation Error Being Reciprocal of Width to Power of Square Root of Depth. Neural Computation, 2021. [pdf] [doi]

[4] H. Montanelli, H. Yang*, Q. Du. Deep ReLU Networks Overcome the Curse of Dimensionality for Bandlimited Functions. Journal of Computational Mathematics, 2021. [pdf] [doi]

[3] Z. Shen^, H. Yang^*, S. Zhang^. Deep Network Approximation Characterized by Number of Neurons. Communications in Computational Physics, 2021. [pdf] [doi]

[2] H. Montanelli^*, H. Yang^. Error Bounds for Deep ReLU Networks Using the Kolmogorov–Arnold Superposition Theorem. Neural Networks, 2020. [pdf] [doi]

[1] Z. Shen^, H. Yang^*, S. Zhang^. Nonlinear Approximation via Compositions. Neural Networks, 2019. [pdf] [doi]

^: Equal contribution; *: Corresponding author.