Publications
Preprints
B. Adcock, S. Brugiapaglia, N. Dexter, and S. Moraga. Learning smooth functions in high dimensions: from sparse polynomials to deep neural networks. Submitted, 2024. [arXiv]
E. Zangrando, P. Deidda, S. Brugiapaglia, N. Guglielmi, and F. Tudisco. Neural rank collapse: weight decay and small within-class variability yield low-rank bias. Submitted, 2024. [arXiv]
N.R. Franco and S. Brugiapaglia. A practical existence theorem for reduced order models based on convolutional autoencoders. Submitted, 2024. [arXiv]
G.A. D'Inverno, S. Brugiapaglia, and M. Ravanelli. Generalization limits of graph neural networks in identity effects learning. Submitted, 2023. [arXiv] [GitHub]
S. M.-Taheri and S. Brugiapaglia. The greedy side of the LASSO: new algorithms for weighted sparse recovery via loss function-based orthogonal matching pursuit. Submitted, 2023. [arXiv] [GitHub]
B. Adcock, S. Brugiapaglia, N. Dexter, and S. Moraga. Near-optimal learning of Banach-valued, high-dimensional functions via deep neural networks. Submitted, 2022. [arXiv]
B. Adcock and S. Brugiapaglia. Monte Carlo is a good sampling strategy for polynomial approximation in high dimensions. Submitted, 2022. [arXiv] [GitHub]
Publications
Books
1. B. Adcock, S. Brugiapaglia, and C.G. Webster. Sparse Polynomial Approximation of High-Dimensional Functions. Society for Industrial and Applied Mathematics, 2022. [Companion Website]
Book chapters
2. S. Brugiapaglia. A compressive spectral collocation method for the diffusion equation under the restricted isometry property. In "Quantification of Uncertainty: Improving Efficiency and Technology", series "Lecture Notes in Computational Science and Engineering", Vol. 137, pp. 15–40, Springer, Cham, 2020. [DOI] [arXiv] [GitHub]
1. B. Adcock, S. Brugiapaglia, and C.G. Webster. Compressed sensing approaches for polynomial approximation of high-dimensional functions. In "Compressed Sensing and its Applications", series "Applied and Numerical Harmonic Analysis", pp. 93–124, Springer, Cham, 2018. [DOI] [arXiv]
Journal articles
17. A. Berk, S. Brugiapaglia, and T. Hoheisel. Square Root LASSO: well-posedness, Lipschitz stability and the tuning trade-off. SIAM Journal on Optimization, accepted, 2024. [arXiv] [GitHub]
16. B. Adcock, S. Brugiapaglia, N. Dexter, and S. Moraga. On efficient algorithms for computing near-best polynomial approximations to high-dimensional, Hilbert-valued functions from limited samples. Memoirs of the European Mathematical Society, accepted, 2023. [arXiv]
15. W. Wang and S. Brugiapaglia. Compressive Fourier collocation methods for high-dimensional diffusion equations with periodic boundary conditions. IMA Journal of Numerical Analysis, accepted, 2023. [DOI] [arXiv] [GitHub]
14. A. Berk, S. Brugiapaglia, and T. Hoheisel. LASSO reloaded: a variational analysis perspective with applications to compressed sensing. SIAM Journal on Mathematics of Data Science, 5(4), pp. 1102–1129, 2023. [DOI] [arXiv]
13. A. Berk, S. Brugiapaglia, B. Joshi, Y. Plan, M. Scott, and Ö. Yilmaz. A coherence parameter characterizing generative compressed sensing with Fourier measurements. IEEE Journal on Selected Areas in Information Theory (Special Issue on Deep Learning Methods for Inverse Problems), 3(3), pp. 502–512, 2022. [DOI] [arXiv] [GitHub]
12. S. Brugiapaglia, M. Liu, and P. Tupper. Invariance, encodings, and generalization: learning identity effects with neural networks. Neural Computation, 34(8), pp. 1756–1789, 2022. [DOI] [arXiv] [GitHub]
11. B. Adcock, S. Brugiapaglia, and M. King-Roskamp. Do log factors matter? On optimal wavelet approximation and the foundations of compressed sensing. Foundations of Computational Mathematics, 22, pp. 99–159, 2022. [DOI] [arXiv]
10. B. Adcock, S. Brugiapaglia, and M. King-Roskamp. The benefits of acting locally: reconstruction algorithms for sparse in levels signals with stable and robust recovery guarantees. IEEE Transactions on Signal Processing, 69, pp. 3160–3175, 2021. [DOI] [arXiv]
9. S. Brugiapaglia, S. Dirksen, H.C. Jung, and H. Rauhut. Sparse recovery in bounded Riesz systems with applications to numerical methods for PDEs. Applied and Computational Harmonic Analysis, 53, pp. 231–269, 2021. [DOI] [arXiv]
8. S. Brugiapaglia, S. Micheletti, F. Nobile, and S. Perotto. Wavelet-Fourier CORSING techniques for multi-dimensional advection-diffusion-reaction equations. IMA Journal of Numerical Analysis, 41(4), pp. 2744–2781, 2021. [DOI] [arXiv] [Supplementary material] [GitHub]
7. B. Adcock, C. Boyer, and S. Brugiapaglia. On oracle-type local recovery guarantees in compressed sensing. Information and Inference: A Journal of the IMA, 10(1), pp. 1–49, 2021. [DOI] [arXiv] [GitHub]
6. S. Brugiapaglia, L. Tamellini, and M. Tani. Compressive isogeometric analysis. Computers & Mathematics with Applications, 80(12), pp. 3137–3155, 2020. [DOI] [arXiv]
5. B. Adcock, A. Bao, and S. Brugiapaglia. Correcting for unknown errors in sparse high-dimensional function approximation. Numerische Mathematik, 142(3), pp. 667–711, 2019. [DOI] [arXiv]
4. S. Brugiapaglia and B. Adcock. Robustness to unknown error in sparse regularization. IEEE Transactions on Information Theory, 64(10), pp. 6638–6661, 2018. [DOI] [arXiv]
3. S. Brugiapaglia, F. Nobile, S. Micheletti, and S. Perotto. A theoretical study of COmpRessed SolvING for advection-diffusion-reaction problems. Mathematics of Computation, 87(309), pp. 1–38, 2018. [DOI] [ResearchGate]
2. S. Brugiapaglia, S. Micheletti, and S. Perotto. Compressed solving: a numerical approximation technique for elliptic PDEs based on compressed sensing. Computers & Mathematics with Applications, 70(6), pp. 1306–1335, 2015. [DOI] [ResearchGate]
1. S. Brugiapaglia and L. Gemignani. On the simultaneous refinement of the zeros of H-palindromic polynomials. Journal of Computational and Applied Mathematics, 272, pp. 293–303, 2014. [DOI] [ResearchGate]
Conference proceedings (refereed)
7. A. Berk, S. Brugiapaglia, Y. Plan, M. Scott, X. Sheng, and Ö. Yilmaz. Model-adapted Fourier sampling for generative compressed sensing. Proceedings of the NeurIPS 2023 workshop "Deep Learning and Inverse Problems", New Orleans, LA, US, 2023. [Published version] [arXiv]
6. N. Dexter, S. Moraga, S. Brugiapaglia, and B. Adcock. Effective deep neural network architectures for learning high-dimensional Banach-valued functions from limited data. Proceedings of the 8th International Conference on Computational Harmonic Analysis (ICCHA2022), Ingolstadt, Germany, 2022. [Published version]
5. B. Adcock, S. Brugiapaglia, N. Dexter, and S. Moraga. Deep neural networks are effective at learning high-dimensional Hilbert-valued functions from limited data. Proceedings of the 2nd Annual Conference on Mathematical and Scientific Machine Learning, Proceedings of Machine Learning Research, vol. 145, pp. 1–36, 2022. [Published version] [arXiv]
4. B. Adcock, S. Brugiapaglia, N. Dexter, and S. Moraga. Learning high-dimensional Hilbert-valued functions with deep neural networks from limited data. Proceedings of the AAAI 2021 Spring Symposium on Combining Artificial Intelligence and Machine Learning with Physical Sciences, Stanford, CA, US, 2021. [PDF]
3. S. Brugiapaglia, M. Liu, and P. Tupper. Generalizing outside the training set: when can neural networks learn identity effects? Proceedings of CogSci 2020. [Published version] [arXiv] [GitHub]
2. B. Adcock and S. Brugiapaglia. Sparse approximation of multivariate functions from small datasets via weighted orthogonal matching pursuit. In "Spectral and High Order Methods for Partial Differential Equations ICOSAHOM 2018" (S.J. Sherwin, D. Moxey, J. Peiró, P.E. Vincent, and C. Schwab, eds.), series "Lecture Notes in Computational Science and Engineering", Vol. 134, Springer, Cham, 2020. [DOI] [arXiv]
1. S. Brugiapaglia, B. Adcock, and R.K. Archibald. Recovery guarantees for compressed sensing with unknown error. Proceedings of the 12th International Conference "Sampling Theory and Applications" (SampTA), Tallinn, Estonia, 2017. [DOI] [arXiv]
Conference proceedings (non-refereed)
1. B. Adcock, S. Brugiapaglia, and M. King-Roskamp. Iterative and greedy algorithms for the sparsity in levels model in compressed sensing. Proceedings of the Conference "SPIE Optical Engineering + Applications", San Diego, CA, US, 2019. [DOI] [arXiv]
Theses
3. COmpRessed SolvING: Sparse Approximation of PDEs based on Compressed Sensing. Ph.D. thesis, Politecnico di Milano, 2016. (Advisors: S. Perotto and S. Micheletti) [Published version]
2. Problemi non lineari agli autovalori per l'analisi della stabilità di equazioni differenziali con ritardo [Nonlinear eigenvalue problems for the stability analysis of delay differential equations]. M.Sc. thesis, University of Pisa, 2012. (Advisor: L. Gemignani) [Academia.edu]
1. Gli schemi di suddivisione: analisi della convergenza nel caso univariato stazionario [Subdivision schemes: convergence analysis in the stationary univariate case]. B.Sc. thesis, University of Pisa, 2010. (Advisor: D. Bini) [Academia.edu]