In recent years, I have developed unsupervised machine learning (ML) models for analyzing spectra of proteins in water solutions at different temperatures and concentrations. The spectral data come from various spectroscopic techniques: UV resonance Raman spectroscopy, circular dichroism, and UV absorbance, performed at the Elettra Sincrotrone facility in Trieste, Italy. These data are high-dimensional (with thousands of features) and noisy. Therefore, inferring information about structural changes in proteins requires ML methods such as similarity metrics and dimensionality-reduction algorithms, e.g. principal component analysis (PCA). One of the key points of this research is taming the noise in the data and overcoming the limitations of PCA: since PCA is linear, it cannot faithfully represent high-dimensional data that lie on non-linear manifolds.
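As a minimal illustration of this kind of pipeline, the sketch below builds synthetic two-band "spectra" standing in for the real measurements (the bands, noise level, and temperature ramp are invented for illustration, not the actual Elettra data) and applies PCA via an SVD of the mean-centred data:

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic "spectra": 40 samples x 1000 wavelengths, a two-state mixture
wl = np.linspace(0.0, 1.0, 1000)
band_a = np.exp(-((wl - 0.3) ** 2) / 0.002)    # toy "folded-state" band
band_b = np.exp(-((wl - 0.6) ** 2) / 0.002)    # toy "unfolded-state" band
frac = np.linspace(0.0, 1.0, 40)[:, None]      # stand-in for a temperature ramp
X = frac * band_a + (1.0 - frac) * band_b
X += 0.02 * rng.standard_normal(X.shape)       # measurement noise

# PCA via SVD of the mean-centred data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)                # explained-variance ratios
scores = Xc @ Vt[:2].T                         # projection onto first two PCs
```

Here the variation is essentially one-dimensional (the mixing fraction), so the first principal component captures most of the variance despite the noise; real protein spectra are exactly where this linear picture starts to break down.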
The theory of probability, especially from a Bayesian perspective, is a focus of my investigations, as it naturally emerges in many fields, such as complex systems and cognitive science, where a probabilistic framework is crucial for modeling learning. Furthermore, probability theory underpins many machine learning algorithms, such as Gaussian processes, where the Bayesian approach proves exceptionally powerful. One example of Bayesian modeling of word learning in language dynamics is the Bayesian Naming Game, which replaces the name learning of the standard naming game with human-like word learning in the probabilistic framework proposed by MIT professor Josh Tenenbaum (see Figure below).
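A toy sketch of Tenenbaum-style word learning uses the "size principle": a consistent example under hypothesis h has likelihood 1/|h|, so repeated examples rapidly favor the smallest consistent meaning. The hypothesis names and set sizes below are made up for illustration; this is not the full Bayesian Naming Game.

```python
# candidate word meanings and the number of objects each covers (illustrative)
hypotheses = {"dalmatians": 5, "dogs": 50, "animals": 500}
prior = {h: 1.0 / len(hypotheses) for h in hypotheses}

def posterior(n_examples):
    """Posterior after n consistent examples, via the size principle:
    each example drawn from hypothesis h has likelihood 1/|h|."""
    unnorm = {h: prior[h] * (1.0 / size) ** n_examples
              for h, size in hypotheses.items()}
    Z = sum(unnorm.values())
    return {h: v / Z for h, v in unnorm.items()}
```

After a single example the learner already leans toward the narrow meaning; after three, the posterior concentrates almost entirely on it, which is the human-like fast-mapping behavior the probabilistic framework is meant to capture.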
I am also interested in applying ML to physical/chemical problems, e.g. representations of systems commonly referred to as descriptors. Recently, together with Alessandro Romualdi, I developed a regression model based on convolutional neural networks for accurately predicting the s-wave scattering phase shifts of various short-range potentials, e.g. the Yukawa (or Thomas-Fermi) potential; see the related publication in Eur. Phys. J. B 94 (2021). More recently, I have become interested in some aspects of Deep Learning.
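The target quantity itself can be sketched directly: integrating the radial equation with the Numerov method and matching to the free asymptotic form gives the s-wave phase shift that such a network is trained to predict. Units are 2m = ħ = 1; the grid sizes and Yukawa strength below are illustrative choices, not those of the paper.

```python
import numpy as np

def s_wave_phase_shift(V, k, r_max=30.0, n=6000):
    """s-wave phase shift for a short-range potential V(r), units 2m = hbar = 1,
    so the radial equation reads u''(r) = (V(r) - k^2) u(r) (Numerov method)."""
    r = np.linspace(1e-4, r_max, n)
    h = r[1] - r[0]
    f = V(r) - k**2
    c = 1.0 - (h**2 / 12.0) * f
    u = np.zeros(n)
    u[0], u[1] = 0.0, h                       # regular solution, u ~ r at origin
    for i in range(1, n - 1):                  # Numerov recurrence
        u[i + 1] = ((2.0 + (5.0 * h**2 / 6.0) * f[i]) * u[i]
                    - c[i - 1] * u[i - 1]) / c[i + 1]
    # match u(r) ~ sin(k r + delta) at two points a quarter wavelength apart
    i2 = n - 1
    i1 = i2 - max(1, int(np.pi / (2.0 * k) / h))
    u1, u2, r1, r2 = u[i1], u[i2], r[i1], r[i2]
    num = u1 * np.sin(k * r2) - u2 * np.sin(k * r1)
    den = u2 * np.cos(k * r1) - u1 * np.cos(k * r2)
    return np.arctan2(num, den)
```

For V = 0 the phase shift vanishes, while a weak attractive Yukawa potential pulls the wave function in and produces a positive shift, as the Born approximation predicts.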
I investigate various subjects in condensed matter theory: spin transport and decoherence in semiconductors, electron-electron scattering in crystalline solids, screening of the electron-electron interaction, and the theory of the Thomas-Fermi potential. Solid state physics is characterized by Anderson's motto "more is different". Although it was called "squalid state physics" by Gell-Mann and Schmutzphysik (dirty physics) by Pauli, it is a beautiful, rich research field that offers the possibility of comparing theory with experiments. My two most recent works concern the Thomas-Fermi theory, the first density functional theory (DFT), in three-dimensional condensed matter systems, and how electron-electron scattering affects spin decoherence in GaAs semiconductors.
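For the screening part, the textbook Thomas-Fermi result for a free-electron gas can be sketched in a few lines in atomic units (a0 = 1): k_TF² = 4 k_F/(π a0) with k_F = (3π²n)^(1/3), turning the bare Coulomb potential into a Yukawa-like screened one. The densities used below are illustrative.

```python
import numpy as np

def thomas_fermi_k(n):
    """Thomas-Fermi screening wave vector (atomic units, a0 = 1):
    k_TF^2 = 4 k_F / pi, with k_F = (3 pi^2 n)^(1/3)."""
    k_F = (3.0 * np.pi**2 * n) ** (1.0 / 3.0)
    return np.sqrt(4.0 * k_F / np.pi)

def screened_coulomb(r, n, Z=1.0):
    """Thomas-Fermi screened (Yukawa-like) potential -Z exp(-k_TF r)/r (a.u.)."""
    return -Z * np.exp(-thomas_fermi_k(n) * r) / r
```

Denser electron gases screen more strongly (larger k_TF, shorter screening length), and at any finite distance the screened potential is weaker than the bare Coulomb one.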
One of my main interests is many-body physics through Green's functions, as I wish to model electron-electron and electron-phonon scattering beyond standard approximations such as the random phase approximation (RPA).
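As a reference point for what "beyond RPA" means, the standard static RPA result, the Lindhard dielectric function of the 3D electron gas, is compact enough to write down (a textbook formula, sketched here with illustrative parameters):

```python
import numpy as np

def lindhard_static(q, k_F, k_TF):
    """Static RPA (Lindhard) dielectric function of the 3D electron gas:
    eps(q) = 1 + (k_TF^2 / q^2) * g(x), x = q / (2 k_F), where
    g(x) = 1/2 + (1 - x^2)/(4x) * ln|(1 + x)/(1 - x)|, and g(0) = 1
    recovers the Thomas-Fermi limit."""
    x = q / (2.0 * k_F)
    g = 0.5 + (1.0 - x**2) / (4.0 * x) * np.log(abs((1.0 + x) / (1.0 - x)))
    return 1.0 + (k_TF**2 / q**2) * g
```

In the long-wavelength limit this reduces to the Thomas-Fermi form 1 + k_TF²/q²; the famous weak singularity at q = 2k_F is what RPA adds, and vertex corrections beyond RPA modify precisely this kind of structure.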
Spin transport in n-doped GaAs from ensemble Monte Carlo simulations.
Ergodicity in classical statistical mechanics is another topic I am interested in. In particular, I am trying to understand the dynamics of the Fermi-Pasta-Ulam-Tsingou (FPUT) model through methods of topological data analysis (TDA), e.g. persistent homology.
Time evolution of selected mode energies in the FPUT model with N=32 oscillators, from my simulations on a computing cluster, with only the first mode initially excited.
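A stripped-down version of such a simulation can be sketched as follows: an α-FPUT chain with fixed ends, integrated with velocity Verlet, starting with all the energy in the first normal mode. The values of α, the time step, and the amplitude below are illustrative, not those of my cluster runs.

```python
import numpy as np

def fput_alpha(N=32, alpha=0.25, amp=1.0, dt=0.02, steps=5000):
    """alpha-FPUT chain with fixed ends; returns initial energy, final energy,
    and the final harmonic normal-mode energies E_k = (Adot_k^2 + w_k^2 A_k^2)/2."""
    n = np.arange(1, N + 1)
    k = np.arange(1, N + 1)
    # orthogonal sine transform between particle and normal-mode coordinates
    S = np.sqrt(2.0 / (N + 1)) * np.sin(np.pi * np.outer(k, n) / (N + 1))
    omega = 2.0 * np.sin(np.pi * k / (2.0 * (N + 1)))

    q = amp * S[0]                 # only mode k = 1 excited initially
    p = np.zeros(N)

    def accel(q):
        qp = np.concatenate(([0.0], q, [0.0]))          # fixed boundary masses
        d = np.diff(qp)                                  # bond stretches
        return d[1:] - d[:-1] + alpha * (d[1:]**2 - d[:-1]**2)

    def energy(q, p):
        qp = np.concatenate(([0.0], q, [0.0]))
        d = np.diff(qp)
        return 0.5 * p @ p + 0.5 * d @ d + (alpha / 3.0) * np.sum(d**3)

    E0 = energy(q, p)
    for _ in range(steps):                               # velocity Verlet
        p += 0.5 * dt * accel(q)
        q += dt * p
        p += 0.5 * dt * accel(q)

    A, Adot = S @ q, S @ p                               # mode coordinates
    E_modes = 0.5 * (Adot**2 + (omega * A)**2)
    return E0, energy(q, p), E_modes
```

Because the integrator is symplectic, the total energy is conserved to high accuracy, while the nonlinear coupling slowly feeds energy from the first mode into the others, which is exactly the (quasi-)recurrence phenomenon at the heart of the ergodicity question.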
Navier-Stokes equations. During my master's thesis I worked with the Navier-Stokes equations applied to a problem involving a Newtonian viscous fluid. I am still interested in understanding these classical partial differential equations from both physical and mathematical points of view.
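A one-dimensional cousin of the Navier-Stokes equations, the viscous Burgers equation u_t + u u_x = ν u_xx, keeps the same nonlinear-advection-plus-diffusion structure and can be integrated in a few lines (explicit finite differences on a periodic grid; the viscosity, grid, and time step below are illustrative choices kept inside the explicit stability limits):

```python
import numpy as np

def burgers_1d(nu=0.1, N=256, T=1.0, dt=1e-3):
    """Viscous Burgers equation u_t + u*u_x = nu*u_xx on [0, 2*pi], periodic,
    explicit central finite differences (a toy 1D relative of Navier-Stokes)."""
    x = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
    dx = x[1] - x[0]
    u = np.sin(x)                                        # smooth initial datum
    for _ in range(int(T / dt)):
        ux = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)        # d/dx
        uxx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2  # d2/dx2
        u = u + dt * (nu * uxx - u * ux)
    return x, u
```

With positive viscosity the "kinetic energy" ∫u² dx decays monotonically, the discrete analogue of the energy estimate that underlies much of the mathematical analysis of these equations.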