# Preprints

8. S. Brugiapaglia, R. Chiclana, T. Hoheisel and M. Iwen. On Continuous Terminal Embeddings of Sets of Positive Reach. Submitted, 2024. [arXiv]

7. S. Brugiapaglia, N. Dexter, S. Karam and W. Wang. Physics-informed deep learning and compressive collocation for high-dimensional diffusion-reaction equations: practical existence theory and numerics. Submitted, 2024. [arXiv]

6. M. Mignacca, S. Brugiapaglia and J.J. Bramburger. Real-time motion detection using dynamic mode decomposition. Submitted, 2024. [arXiv]

5. E. Zangrando, P. Deidda, S. Brugiapaglia, N. Guglielmi and F. Tudisco. Neural rank collapse: weight decay and small within-class variability yield low-rank bias. Submitted, 2024. [arXiv]

4. G.A. D'Inverno, S. Brugiapaglia and M. Ravanelli. Generalization limits of graph neural networks in identity effects learning. Submitted, 2023. [arXiv] [GitHub]

3. S. M.-Taheri and S. Brugiapaglia. The greedy side of the LASSO: New algorithms for weighted sparse recovery via loss function-based orthogonal matching pursuit. Submitted, 2023. [arXiv] [GitHub]

2. B. Adcock, S. Brugiapaglia, N. Dexter and S. Moraga. Near-optimal learning of Banach-valued, high-dimensional functions via deep neural networks. Submitted, 2022. [arXiv]

1. B. Adcock and S. Brugiapaglia. Monte Carlo is a good sampling strategy for polynomial approximation in high dimensions. Submitted, 2022. [arXiv] [GitHub]

# Publications

## Books

B. Adcock, S. Brugiapaglia, and C.G. Webster. Sparse Polynomial Approximation of High-Dimensional Functions. Society for Industrial and Applied Mathematics, 2022. [DOI] [Companion Website] [Google Books] [GitHub]

## Book chapters

3. B. Adcock, S. Brugiapaglia, N. Dexter and S. Moraga. Learning smooth functions in high dimensions: from sparse polynomials to deep neural networks. In "Numerical Analysis meets Machine Learning", series Handbook of Numerical Analysis, Vol. 25, pp. 1-52, 2024. [DOI] [arXiv]

2. S. Brugiapaglia. A compressive spectral collocation method for the diffusion equation under the restricted isometry property. In "Quantification of Uncertainty: Improving Efficiency and Technology", series Lecture Notes in Computational Science and Engineering, Vol. 137, pp. 15-40, Springer, Cham, 2020. [DOI] [arXiv] [GitHub]

1. B. Adcock, S. Brugiapaglia, and C.G. Webster. Compressed sensing approaches for polynomial approximation of high-dimensional functions. In "Compressed Sensing and its Applications", series Applied and Numerical Harmonic Analysis, pp. 93-124, Springer, Cham, 2018. [DOI] [arXiv]

## Journal articles

17. N.R. Franco and S. Brugiapaglia. A practical existence theorem for reduced order models based on convolutional autoencoders. Foundations of Data Science, in press, 2024. [DOI] [arXiv]

16. A. Berk, S. Brugiapaglia, and T. Hoheisel. Square Root LASSO: well-posedness, Lipschitz stability and the tuning trade off. SIAM Journal on Optimization, in press, 2024. [DOI] [arXiv] [GitHub]

15. W. Wang and S. Brugiapaglia. Compressive Fourier collocation methods for high-dimensional diffusion equations with periodic boundary conditions. IMA Journal of Numerical Analysis, in press, 2023. [DOI] [arXiv] [GitHub]

14. A. Berk, S. Brugiapaglia, and T. Hoheisel. LASSO reloaded: a variational analysis perspective with applications to compressed sensing. SIAM Journal on Mathematics of Data Science, 5(4), pp. 1102-1129, 2023. [DOI] [arXiv]

13. A. Berk, S. Brugiapaglia, B. Joshi, Y. Plan, M. Scott, and Ö. Yilmaz. A coherence parameter characterizing generative compressed sensing with Fourier measurements. IEEE Journal on Selected Areas in Information Theory, 3(3), pp. 502-512, Special Issue on Deep Learning Methods for Inverse Problems, 2022. [DOI] [arXiv] [GitHub]

12. S. Brugiapaglia, M. Liu, and P. Tupper. Invariance, encodings, and generalization: learning identity effects with neural networks. Neural Computation, 34(8), pp. 1756-1789, 2022. [DOI] [arXiv] [GitHub]

11. B. Adcock, S. Brugiapaglia, and M. King-Roskamp. Do log factors matter? On optimal wavelet approximation and the foundations of compressed sensing. Foundations of Computational Mathematics, 22, pp. 99-159, 2022. [DOI] [arXiv]

10. B. Adcock, S. Brugiapaglia, and M. King-Roskamp. The benefits of acting locally: Reconstruction algorithms for sparse in levels signals with stable and robust recovery guarantees. IEEE Transactions on Signal Processing, 69, pp. 3160-3175, 2021. [DOI] [arXiv]

9. S. Brugiapaglia, S. Dirksen, H.C. Jung, and H. Rauhut. Sparse recovery in bounded Riesz systems with applications to numerical methods for PDEs. Applied and Computational Harmonic Analysis, 53, pp. 231-269, 2021. [DOI] [arXiv]

8. S. Brugiapaglia, S. Micheletti, F. Nobile, and S. Perotto. Wavelet-Fourier CORSING techniques for multi-dimensional advection-diffusion-reaction equations. IMA Journal of Numerical Analysis, 41(4), pp. 2744-2781, 2021. [DOI] [arXiv] [Supplementary material] [GitHub]

7. B. Adcock, C. Boyer, and S. Brugiapaglia. On oracle-type local recovery guarantees in compressed sensing. Information and Inference: A Journal of the IMA, 10(1), pp. 1-49, 2021. [DOI] [arXiv] [GitHub]

6. S. Brugiapaglia, L. Tamellini, and M. Tani. Compressive Isogeometric Analysis. Computers & Mathematics with Applications, 80(12), pp. 3137-3155, 2020. [DOI] [arXiv]

5. B. Adcock, A. Bao, and S. Brugiapaglia. Correcting for unknown errors in sparse high-dimensional function approximation. Numerische Mathematik, 142(3), pp. 667-711, 2019. [DOI] [arXiv]

4. S. Brugiapaglia and B. Adcock. Robustness to Unknown Error in Sparse Regularization. IEEE Transactions on Information Theory, 64(10), pp. 6638-6661, 2018. [DOI] [arXiv]

3. S. Brugiapaglia, F. Nobile, S. Micheletti, and S. Perotto. A theoretical study of COmpRessed SolvING for advection-diffusion-reaction problems. Mathematics of Computation, 87(309), pp. 1-38, 2018. [DOI] [ResearchGate]

2. S. Brugiapaglia, S. Micheletti, and S. Perotto. Compressed solving: A numerical approximation technique for elliptic PDEs based on Compressed Sensing. Computers & Mathematics with Applications, 70(6), pp. 1306-1335, 2015. [DOI] [ResearchGate]

1. S. Brugiapaglia and L. Gemignani. On the simultaneous refinement of the zeros of H-palindromic polynomials. Journal of Computational and Applied Mathematics, 272, pp. 293-303, 2014. [DOI] [ResearchGate]

## Conference proceedings (refereed)

8. S. M.-Taheri, M. Colbrook and S. Brugiapaglia. OMP-Net: Neural network unrolling of weighted Orthogonal Matching Pursuit. Proceedings of the 6th International Workshop on the Theory of Computational Sensing and its Applications to Radar, Multimodal Sensing, and Imaging (CoSeRa 2024), Santiago de Compostela, Spain. Accepted, 2024.

7. A. Berk, S. Brugiapaglia, Y. Plan, M. Scott, X. Sheng, and Ö. Yilmaz. Model-adapted Fourier sampling for generative compressed sensing. Proceedings of the NeurIPS 2023 workshop "Deep Learning and Inverse Problems" (Oral Presentation). New Orleans, LA, US, 2023. [Published version] [arXiv]

6. N. Dexter, S. Moraga, S. Brugiapaglia and B. Adcock. Effective deep neural network architectures for learning high-dimensional Banach-valued functions from limited data. Proceedings of the 8th International Conference on Computational Harmonic Analysis 2022 (ICCHA2022), Ingolstadt, Germany, 2022. [Published version]

5. B. Adcock, S. Brugiapaglia, N. Dexter, and S. Moraga. Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data. Proceedings of the 2nd Annual Conference on Mathematical and Scientific Machine Learning, Proceedings of Machine Learning Research, vol. 145, pp. 1-36, 2022. [Published version] [arXiv]

4. B. Adcock, S. Brugiapaglia, N. Dexter, and S. Moraga. Learning High-Dimensional Hilbert-Valued Functions With Deep Neural Networks From Limited Data. Proceedings of the AAAI 2021 Spring Symposium on Combining Artificial Intelligence and Machine Learning with Physical Sciences, Stanford, CA, US, 2021. [PDF]

3. S. Brugiapaglia, M. Liu, and P. Tupper. Generalizing Outside the Training Set: When Can Neural Networks Learn Identity Effects? Proceedings of CogSci 2020. [Published version] [arXiv] [GitHub]

2. B. Adcock and S. Brugiapaglia. Sparse approximation of multivariate functions from small datasets via weighted orthogonal matching pursuit. In: Sherwin, S.J., Moxey, D., Peiró, J., Vincent, P.E., Schwab, C. (eds) Spectral and High Order Methods for Partial Differential Equations ICOSAHOM 2018. Lecture Notes in Computational Science and Engineering, vol 134. Springer, Cham, 2020. [DOI] [arXiv]

1. S. Brugiapaglia, B. Adcock, and R.K. Archibald. Recovery guarantees for compressed sensing with unknown error. Proceedings of the 12th International Conference "Sampling Theory and Applications" (SampTA). Tallinn, Estonia, 2017. [DOI] [arXiv]

## Conference proceedings (non-refereed)

1. B. Adcock, S. Brugiapaglia, and M. King-Roskamp. Iterative and greedy algorithms for the sparsity in levels model in compressed sensing. Proceedings of the Conference "SPIE Optical Engineering + Applications", San Diego, California, US, 2019. [DOI] [arXiv]

## Theses

3. COmpRessed SolvING: Sparse Approximation of PDEs based on Compressed Sensing. Ph.D. thesis, Politecnico di Milano, 2016. (Advisors: S. Perotto and S. Micheletti) [Published version]

2. Problemi non lineari agli autovalori per l'analisi della stabilità di equazioni differenziali con ritardo (Nonlinear eigenvalue problems for the stability analysis of delay differential equations). M.Sc. thesis, University of Pisa, 2012. (Advisor: L. Gemignani) [Academia.edu]

1. Gli schemi di suddivisione: analisi della convergenza nel caso univariato stazionario (Subdivision schemes: convergence analysis in the stationary univariate case). B.Sc. thesis, University of Pisa, 2010. (Advisor: D. Bini) [Academia.edu]