Publications
Information-theoretic reduction of deep neural networks to linear models in the overparametrized proportional regime, Francesco Camilli, Daria Tieplova, Eleonora Bergamin, Jean Barbier, 38th Conference on Learning Theory (COLT 2025)
Statistical mechanics of extensive-width Bayesian neural networks near interpolation, Jean Barbier, Francesco Camilli, Minh-Toan Nguyen, Mauro Pastore, Rudy Skerk, (2025), https://doi.org/10.48550/arXiv.2501.18530
Phase diagram of extensive-rank symmetric matrix denoising beyond rotational invariance, Jean Barbier, Francesco Camilli, Justin Ko, Koki Okajima (2024), Physical Review X, https://doi.org/10.48550/arXiv.2411.01974
Information limits and Thouless-Anderson-Palmer equations for spiked matrix models with structured noise, Jean Barbier, Francesco Camilli, Marco Mondelli, Yizhou Xu, Physical Review Research 7 (2025), https://doi.org/10.1103/PhysRevResearch.7.013081
A multiscale cavity method for sublinear-rank matrix factorization, Jean Barbier, Justin Ko, Anas Rahman, (2024), https://doi.org/10.48550/arXiv.2403.07189
Matrix inference in growing rank regimes, Farzad Pourkamali, Jean Barbier, Nicolas Macris, IEEE Transactions on Information Theory (2024), https://doi.org/10.48550/arXiv.2306.01412
Fundamental limits of overparametrized shallow neural networks for supervised learning, Francesco Camilli, Daria Tieplova, Jean Barbier, (2023), https://doi.org/10.48550/arXiv.2307.05635
Fundamental limits in structured principal component analysis and how to reach them, Jean Barbier, Francesco Camilli, Marco Mondelli, Manuel Sáenz, Proceedings of the National Academy of Sciences (PNAS) 120 (30) e2302028120 (2023), https://doi.org/10.1073/pnas.2302028120