# Mircea Petrache

**What's new:**

- Organizing workshop on **"Theoretical and Mathematical aspects of Deep Learning"**, 4 August 2022, at UC Chile (hybrid event)
- Visiting **Augusto Gerolin** (Ottawa), 16-26 July 2022
- Organizing summer school **Point configurations: deformations and rigidity**, London, 26 June - 2 July 2022
- **"Classification of uniformly distributed measures of dimension 1 in general codimension"** (with P. Laurain) appeared in Asian J. Math. https://www.intlpress.com/site/pub/pages/journals/items/ajm/content/vols/0025/0004/a006/
- New preprint **"Almost sure recovery in quasi-periodic structures"**, with Rodolfo Viera: https://arxiv.org/abs/2112.11613
- Fondecyt Regular grant (4 years, 2021-2025). Project title: **"Rigidity, stability and uniformity for large point configurations"**
- New preprint **"Sharp discrete isoperimetric inequalities in periodic graphs via discrete PDE and Semidiscrete Optimal Transport"**, with Matias Gomez (master student, UTFSM): https://arxiv.org/abs/2012.11039

(last updated: July 9, 2022)

**Contact:** *Facultad de Matemáticas, Avda. Vicuña Mackenna 4860, Macul, Santiago, 6904441, Chile*

**Email:** *mpetrache@mat.uc.cl* **Cellphone:** *+56 9 3686 3545* **Office phone:** *23544038*

*Mathematics is there to interact with other sciences.*

*I'm actively searching for new ways to apply it to real-world problems. I will not work on a topic unless I actually believe that the gained knowledge can later be applied outside of mathematics.*

*I was trained in PDEs, Calculus of Variations, Geometric Analysis and Geometric Measure Theory, which I now use to study emergent behavior and structures, especially (but not only) for large point configurations and deep learning.*

## CV

Since January 2018 I have been Assistant Professor at PUC Chile.

September-December 2017: visitor at the FIM in Zürich and at Vanderbilt University.

2015-2017: Postdoc at the Max Planck Institute MIS in Leipzig and at the Max Planck Institute in Bonn. (Funding: European Post-Doc Institute. Mentors: B. Kirchheim, S. Müller)

2013-2015: Postdoc at Laboratoire Jacques-Louis Lions. (Funding: Fondation Sciences Mathématiques de Paris. Mentor: Sylvia Serfaty)

2013: Ph.D. at ETH Zürich. (Thesis: "*Weak bundle structures and a variational approach to Yang-Mills Lagrangians in supercritical dimensions*". Advisor: Tristan Rivière)

2008: MSc at Scuola Normale Superiore. (Thesis: "*Differential Inclusions and the Euler equation*". Advisor: Luigi Ambrosio)

**Scientific themes I care about (storyline):**

**Topological singularities and large point configurations (my mathematical upbringing):**

In geometry and physics, **vortices** or **topological defects** form and are studied via regularity theory.

Groups of defects (in a setup including, but not restricted to, "points = topological singularities") form **interacting point configurations**. These configurations often organize themselves in relation with **the geometry/shape of the ambient space.**

How to quantify "regularity" for discrete structures? We may look at **distance/Fourier/spectral statistics of the points** and quantify noise levels: we then go from **random**, to **hyperuniform**, to **regular/crystalline** configurations. **Persistence/recovery questions** for these statistics are closely related.

Algorithmic complexity viewpoint: a limiting factor in working with general point configurations is their **exponential complexity** (the space of configurations grows exponentially in complexity with the number of points), and **high regularity = low complexity**: to describe points forming a regular lattice we only need to fix a basis and an origin, while to describe a random point cloud we must enumerate the points one by one!
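The random-to-hyperuniform-to-crystalline scale can be made concrete through the *number variance*: how much the count of points inside a randomly placed window fluctuates. Below is a minimal illustrative sketch (my own toy example, not code from this page; all names and parameters are mine), comparing a random cloud with a square lattice on the unit torus:

```python
import random

def number_variance(points, radius, trials=1000, seed=0):
    """Variance of the number of points inside a disk of the given radius,
    centered uniformly at random in the unit torus [0,1)^2."""
    rng = random.Random(seed)
    r2 = radius ** 2
    counts = []
    for _ in range(trials):
        cx, cy = rng.random(), rng.random()
        c = 0
        for x, y in points:
            dx = abs(x - cx); dx = min(dx, 1 - dx)  # periodic distance
            dy = abs(y - cy); dy = min(dy, 1 - dy)
            if dx * dx + dy * dy <= r2:
                c += 1
        counts.append(c)
    mean = sum(counts) / trials
    return sum((c - mean) ** 2 for c in counts) / trials

n = 32  # both clouds have n*n = 1024 points
lattice = [((i + 0.5) / n, (j + 0.5) / n) for i in range(n) for j in range(n)]
rng = random.Random(1)
cloud = [(rng.random(), rng.random()) for _ in range(n * n)]

# For the random cloud the variance is comparable to the mean count
# (a "volume law"); for the lattice only the boundary of the window
# contributes fluctuations, so the variance is far smaller.
var_random = number_variance(cloud, 0.1)
var_lattice = number_variance(lattice, 0.1)
```

Hyperuniform configurations sit in between: their number variance still grows with the window, but strictly slower than the window's volume.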

**The nicest models of large point configurations we use combine elegance & generality with built-in exponential complexity.**

**But if we stick with these models, we might miss that the data itself is actually much better behaved!**

**How to improve our way of thinking about (our classical models of) large groups of points in space, to make better sense of real data points?**

### A powerful principle is to quantify the **group behavior** of the points through **problem-specific** structures, which in turn guide **low-complexity approximations.**

**Examples of what maths to use:**

- Statistical mechanics measures group behavior and correlations, within the theory of **crystallization** and in the study of other **phases of matter.**
- In materials science, **macroscopic/continuum limits** summarize the properties of large numbers of atoms by (fewer) continuum/multiscale variables.
- In probability theory, one can start with **large deviation principles** and **concentration of measure bounds**.
- In computer science, **multi-scale analysis** and **metric dimension reduction** are studied (cf. also **compressed sensing**).
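One concrete instance of metric dimension reduction is a Johnson-Lindenstrauss-style random projection, which approximately preserves all pairwise distances of a point cloud while drastically cutting the ambient dimension. A minimal sketch (my own illustration; names and parameters are assumptions, not from this page):

```python
import math
import random

def random_projection(points, k, seed=0):
    """Project d-dimensional points to k dimensions via a random
    Gaussian matrix with entries scaled by 1/sqrt(k)."""
    rng = random.Random(seed)
    d = len(points[0])
    mat = [[rng.gauss(0, 1) / math.sqrt(k) for _ in range(d)]
           for _ in range(k)]
    return [[sum(row[j] * p[j] for j in range(d)) for row in mat]
            for p in points]

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

d, k = 1000, 200
rng = random.Random(42)
cloud = [[rng.gauss(0, 1) for _ in range(d)] for _ in range(20)]
proj = random_projection(cloud, k)

# Ratios of projected to original pairwise distances: with high
# probability they all stay within a small multiplicative error of 1.
ratios = [dist(proj[i], proj[j]) / dist(cloud[i], cloud[j])
          for i in range(20) for j in range(i + 1, 20)]
```

The point is that the target dimension k needed for a given distortion depends only logarithmically on the number of points, not on the ambient dimension d.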

**Similarly to the themes above, Deep Neural Networks are experimentally shown to "learn" real-life datasets surprisingly well, but we have no good theory for this.**

### What are we missing in order to understand this success? Some threads I'm following at the moment:

- **Quantifying the information flow through Neural Networks**, especially in models which are easily testable on "real data", in order to understand the Lottery Ticket Hypothesis.
- **How Neural Networks process geometric/topological features of the input datasets** (topics including, but not restricted to, Topological Data Analysis and Persistence Theory).
- **Geometry Informed Neural Networks**: why keep building neural networks based on imagining data in R^d or in L^2 norm? I'm thinking especially about ways to use Optimal Transport for DNN theory, and about how to use Gromov hyperbolicity to implement better learning algorithms.
- **Memory mechanisms in Neural Networks and in the brain**: what can neuromorphic Neural Networks teach us about memory? What is their mathematical theory going to look like? This is related to understanding, in a principled way, the **role of recurrence and of time-dependencies in deep learning**.

## Links to some past events (for personal reference):

- **ICIAM conference**, Valencia, Spain, July 15-19, 2019 -- talks in the mini-symposia **"Discrepancy and Minimal Energy"** and **"Molecular simulation: quantum mechanical models"** (photo of myself with L. Betermin at https://iciam2019.org/images/site/FotosCongreso/CommonAreas/Session(4).jpg)
- BIRS workshop, January 27 - February 1, 2019, on **Optimal Transport Methods in Density Functional Theory**
- **Workshop that I co-organized**, November 30, 2018, London: **Symmetries, asymptotics and multi-scale approaches**
- AIM workshop **Discrete geometry and automorphic forms**, San Jose, California, September 24-28, 2018
- Workshop on **Geometric Measure Theory in Verona**, June 11-15, 2018
- Workshop on **Nonlocal interactions: Dislocations and beyond**, Bath, June 11-14, 2018
- ICERM trimester on **Optimal and Random Point Configurations**, Brown University, Providence, February 26 - March 2, 2018
- **Workshop that I co-organized**, February 15-17, 2017, London: **New trends in Mathematical Physics at the interface of Analysis and Probability**