Short bio
I am a computer scientist specializing in machine learning and dynamical systems. I received my B.Sc. (2007) and M.Sc. (2010) in Computer Science, and a Ph.D. (2013) in Information Engineering, Electronics, and Telecommunications from Sapienza University of Rome. During my doctoral studies, I also gained extensive experience in the ICT industry.
Following my Ph.D., I held postdoctoral research positions at Ryerson University (Canada), Politecnico di Milano (Italy), and the Università della Svizzera Italiana (Switzerland). From 2016 to 2019, I served as Assistant Professor at the University of Exeter (UK). I then joined the University of Manitoba (Canada), where I was Associate Professor in the Department of Computer Science and held a Tier 2 Canada Research Chair from 2019 to October 2023.
I am currently Professor and Academic Dean at the Open Institute of Technology (OPIT), where I lead academic and research strategy.
Research interests
My research lies at the intersection of machine learning and complex dynamical systems, with the overarching goal of developing a principled dynamical theory of learning. From this perspective, modern machine learning systems are not merely function approximators but high-dimensional coupled dynamical systems in which model states, parameters, and optimization dynamics interact across multiple time scales. Understanding how these interactions shape learning, memory, and generalization is the central theme of my research program.
Over the course of my career, this program has evolved from earlier work on complex systems and structured machine learning. My initial research focused on machine learning methods for non-Euclidean and geometrically structured data, including graphs, hypergraphs, and temporal networks. I developed theoretical and algorithmic tools for modeling sequences of attributed graphs and analyzing dynamical processes on structured domains. In parallel, I studied real-world complex systems using recurrence analysis, multifractal techniques, and network representations derived from nonlinear time series, bridging ideas from dynamical systems theory and machine learning.
More recently, my research has concentrated on the dynamical foundations of deep learning, particularly in recurrent and gated neural architectures. I investigate how architectural mechanisms and optimization interact to produce heterogeneous time scales of information retention and learning. This work has introduced a framework in which gradient transport, stochastic optimization noise, and architectural structure jointly determine temporal credit assignment. Within this framework, neuron-wise effective learning rates emerge as dynamical quantities governing how information propagates across time during training, leading to a theory of temporal learnability and mechanisms for the emergence of broad time-scale spectra in trained networks.
These ideas form part of a broader research agenda aimed at characterizing learning as a multiscale dynamical process. My current work seeks to extend this perspective beyond recurrent architectures to deep feedforward, state-space, and attention-based models, and to develop mathematical tools for describing phenomena such as task formation, memory persistence, and catastrophic forgetting as dynamical transitions in nonautonomous systems. Ultimately, the goal is to build a unified theory connecting optimization, architecture, and stochasticity to the emergent dynamical regimes that determine what neural networks can learn and retain.
Alongside these theoretical developments, I remain strongly engaged in applications involving complex physical and biological systems, where multiscale temporal dynamics play a central role. My work has included graph-based modeling of protein structure–function relationships, the study of energy and information diffusion in biomolecular systems, and the analysis of molecular dynamics simulations through graph and hypergraph representations. I have also contributed to biomedical signal processing, developing machine learning methods for the prediction of cardiac events from ECG data and seizure detection from EEG signals. More broadly, I am interested in applying dynamical learning frameworks to scientific domains characterized by complex interacting processes across time and scale, including health systems, climate dynamics, materials science, and collective physical systems.
Scientific activities and services
IEEE
IEEE Senior Member
Editorial board
Senior Editor, IEEE Transactions on Neural Networks and Learning Systems
Applied Soft Computing, Elsevier
Reviewer
- Journals
Philosophical Transactions of the Royal Society B
IEEE Transactions on Neural Networks and Learning Systems
IEEE Transactions on Fuzzy Systems
IEEE Computational Intelligence Magazine
Information Sciences, Elsevier
Neural Networks, Elsevier
Applied Soft Computing, Elsevier
Fuzzy Sets and Systems, Elsevier
Engineering Applications of Artificial Intelligence, Elsevier
Neurocomputing, Elsevier
Soft Computing, Springer
Journal of Intelligent Manufacturing, Springer
Granular Computing, Springer
International Journal of Machine Learning and Cybernetics, Springer
IEEE Access
Contemporary Mathematics, AMS
- Conferences
ICML, NIPS, IEEE IJCNN, SCIA, IEEE MLSP, IEEE ICASSP