Short bio
I am a computer scientist specializing in machine learning and dynamical systems. I received my B.Sc. (2007) and M.Sc. (2010) in Computer Science, and a Ph.D. (2013) in Information Engineering, Electronics, and Telecommunications from Sapienza University of Rome. During my doctoral studies, I also gained extensive experience in the ICT industry.
Following my Ph.D., I held postdoctoral positions at Ryerson University (Canada), Politecnico di Milano (Italy), and Università della Svizzera Italiana (Switzerland). From 2016 to 2019, I was Assistant Professor at the University of Exeter (UK). From 2019 to October 2023, I served as Associate Professor and Tier 2 Canada Research Chair in the Department of Computer Science at the University of Manitoba (Canada).
I am currently Professor and Dean at the Open Institute of Technology (OPIT), where I lead academic and research strategy.
Research interests
My research lies at the intersection of machine learning and complex dynamical systems, with the overarching goal of developing a principled dynamical theory of learning. Over the past decade, my work has progressed from graph-based machine learning and nonlinear time series analysis toward a unified framework in which deep neural networks are studied as coupled state–parameter dynamical systems.
My early research focused on machine learning for non-Euclidean and geometrically structured data, particularly graphs and hypergraphs. I developed theoretical and methodological tools for analyzing structured input spaces and sequences of attributed graphs, including temporal and time-varying networks. In parallel, I investigated real-world complex dynamical systems using recurrence analysis, multifractal methods, and network representations derived from time series.
More recently, my work has concentrated on the dynamical foundations of deep learning, with a particular emphasis on recurrent neural networks and gated architectures. I study how gating mechanisms induce neuron- and lag-specific time scales that couple state dynamics to parameter updates, giving rise to effective learning rates that govern temporal credit assignment. This perspective has led to a trilogy of works developing:
(i) the dynamical mechanism underlying effective learning rates,
(ii) a finite-sample theory of temporal learnability via learnability windows under heavy-tailed gradient noise, and
(iii) a stochastic dynamical explanation of how broad or multimodal time-scale spectra emerge during training through anti-collapse dynamics.
Together, these contributions establish a coherent framework in which learning capacity, stability, and forgetting are interpreted as properties of interacting time scales in coupled fast–slow systems. My current research program aims to extend this theory beyond recurrence to deep feedforward, state-space, and attention-based models, and to formalize task formation and catastrophic forgetting as attractor phenomena and bifurcations in nonautonomous dynamical systems.
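To make the gating mechanism described above concrete, the following is a minimal illustrative sketch using generic GRU-style notation; the symbols (z_t, h_t, W, U, b, σ) are my own illustrative choices and are not the formulation used in the works referenced above.

```latex
% Illustrative sketch only: generic GRU-style gated update, not the exact model
% or notation of the papers described in the text above.
\begin{align}
  z_t &= \sigma\!\left(W_z x_t + U_z h_{t-1} + b_z\right), \\
  h_t &= (1 - z_t) \odot h_{t-1} \;+\; z_t \odot \tanh\!\left(W x_t + U h_{t-1} + b\right).
\end{align}
% Ignoring the indirect dependence of z_t and the candidate state on h_{t-1},
% the state-to-state Jacobian is approximately
%   \partial h_t / \partial h_{t-1} \approx \mathrm{diag}(1 - z_t),
% so a gradient propagated back over k steps is damped roughly by
%   \prod_{s=t-k+1}^{t} (1 - z_s).
% Each unit i therefore retains information on a time scale
%   \tau_i \approx 1 / z_{t,i}   (for small z_{t,i}),
% and the same damping factor rescales the parameter updates that reach lag k,
% i.e. a neuron- and lag-specific effective learning rate.
```

In this caricature, the gate simultaneously sets how quickly the state forgets and how strongly distant time steps contribute to parameter updates, which is the coupling between state dynamics and learning dynamics referred to above.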
Alongside these theoretical developments, I remain strongly engaged in applications involving complex physical and biological systems. My previous work includes graph-based modeling of protein structure–function relationships, energy and information diffusion in biomolecular systems, and the analysis of molecular dynamics simulations via graph and hypergraph representations. I have also contributed to biomedical signal processing, developing predictive models for atrial fibrillation and myocardial infarction from ECG data, and machine learning methods for seizure prediction and localization from EEG signals. More broadly, I am interested in applying dynamical learning frameworks to scientific domains characterized by multiscale temporal structure, including health, climate systems, materials science, and interacting particle systems.
Scientific activities and services
IEEE
IEEE Senior Member
Editorial board
Senior Editor, IEEE Transactions on Neural Networks and Learning Systems
Applied Soft Computing, Elsevier
Reviewer
- Journals
Philosophical Transactions of the Royal Society B
IEEE Transactions on Neural Networks and Learning Systems
IEEE Transactions on Fuzzy Systems
IEEE Computational Intelligence Magazine
Information Sciences, Elsevier
Neural Networks, Elsevier
Applied Soft Computing, Elsevier
Fuzzy Sets and Systems, Elsevier
Engineering Applications of Artificial Intelligence, Elsevier
Neurocomputing, Elsevier
Soft Computing, Springer
Journal of Intelligent Manufacturing, Springer
Granular Computing, Springer
International Journal of Machine Learning and Cybernetics, Springer
IEEE Access
Contemporary Mathematics, American Mathematical Society
- Conferences
ICML, NIPS, IEEE IJCNN, SCIA, IEEE MLSP, IEEE ICASSP