PhD - Postdoctoral Researcher
Institute of Neural Information Processing, Center for Molecular Neurobiology Hamburg (ZMNH), University Medical Center Hamburg-Eppendorf (UKE), Hamburg
I study how long-term memory is learned, maintained, consolidated and processed in artificial neural networks, focusing in particular on the role of inhibitory plasticity in these processes. I explore these questions in different classes of models, such as phase oscillators, spiking neural networks and mean-field models. The aim of my research is both to gain a better understanding of the biological mechanisms that govern the brain and to integrate certain neural principles into artificial intelligence models.
My two main current research areas are: Sustainable memory learning, and Memory storage and processing.
Alumnus of:
Neurocybernetics team, ETIS lab, CY Cergy Paris University, Cergy
Computational Neuroscience Group, Center for Brain and Cognition, Pompeu Fabra University, Barcelona
Artificial neural networks
Spiking neurons
Coupled oscillators
Computational neuroscience
Structure in neural networks
Learning in neural networks
Memory
Multimodal learning and processing
Inhibition
Synchronization
September 11, 2025 - Attendance at the "Hamburg neuro talks", Hamburg, Germany
September 4, 2025 - Attendance at the ZMNH retreat, Hamburg, Germany
June 23-27, 2025 - XLV Dynamics Days Europe, Thessaloniki, Greece
Talk entitled "Coherent and rich neuronal dynamics supported by different inhibition mechanisms", given in the minisymposium "Neural Dynamics Across Spatiotemporal Scales: Models, Learning Processes, Computational Tools, and Clinical Applications".
April 22, 2025 - Publication of a journal paper in PLoS Computational Biology
R. Bergoin, A. Torcini, G. Deco, M. Quoy, G. Zamora-López. Emergence and maintenance of modularity in neural networks with Hebbian and anti-Hebbian inhibitory STDP. PLoS Comput Biol 21(4): e1012973, 2025.
November 5-6, 2024 - Attendance at the SNUFA 2024 workshop, online.
October 15-17, 2024 - Attendance at the workshop "Introduction to Recurrent Neural Networks and their Applications" organized by the Center for Biomedical AI at the University Medical Center Hamburg-Eppendorf (UKE).
September 7, 2024 - Publication of a conference paper in From Animals to Animats: 17th International Conference on the Simulation of Adaptive Behavior (SAB 2024)
L. L’Haridon, R. Bergoin, B.S. Bal, M. Abdelwahed and L. Cañamero. The Emergence of a Complex Representation of Touch Through Interaction with a Robot. In: O. Brock, J. Krichmar (eds) From Animals to Animats 17. SAB 2024. Lecture Notes in Computer Science, vol 14993. Springer, Cham, 2024.
September 3, 2024 - Publication of a journal paper in Scientific Reports
R. Bergoin, S. Boucenna, R. D’Urso, D. Cohen and A. Pitti. A developmental model of audio-visual attention (MAVA) for bimodal language learning in infants and robots. Scientific Reports, 14, 20492, 2024.
Spontaneous recall of the learned items during the network's resting dynamics promotes the consolidation of the associated structural clusters, and thus the long-term maintenance of memories (see the illustrative sketch after these highlights).
The number of inhibitory neurons present in the network is correlated with the number of structural clusters that can be formed and stabilized over time, and therefore with the memory storage capacity of the network.
This spiking neural network can recognize and generate previously learned audio-visual digits in a multimodal scenario.
Neuronal populations oscillate at their resonant frequencies.
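To give a flavour of the kind of models behind these results, the following minimal Python sketch simulates a small Kuramoto-type network of phase oscillators with an adaptive, Hebbian-like coupling rule. It is an illustrative toy, not one of the published models: the network size, the two assumed populations with distinct resonant frequencies, the form of the coupling rule and all parameter values are assumptions chosen only to show how structural clusters can emerge from the phase dynamics and persist while the network runs freely.

# Minimal, hypothetical sketch (not the published models): a small Kuramoto-type
# network of phase oscillators with an adaptive, Hebbian-like coupling rule.
# Oscillators of the same "population" share a resonant frequency; pairs that
# stay in phase strengthen their coupling, pairs that drift apart do not, so
# structural clusters matching the two populations emerge and then persist.
import numpy as np

rng = np.random.default_rng(0)

N = 20                                     # total number of phase oscillators (assumed)
group = np.repeat([0, 1], N // 2)          # two populations of 10 oscillators
omega = np.where(group == 0, 1.0, 1.4)     # distinct resonant frequencies (assumed values)
omega = omega + rng.normal(0.0, 0.02, N)   # small within-population heterogeneity

theta = rng.uniform(0.0, 2.0 * np.pi, N)   # initial phases
K = rng.uniform(-0.1, 0.1, (N, N))         # adaptive coupling weights
np.fill_diagonal(K, 0.0)

dt, steps, eps = 0.01, 40000, 0.02         # integration step, duration, adaptation rate

for _ in range(steps):
    # Kuramoto dynamics: each phase is pulled by its coupled neighbours.
    phase_diff = theta[None, :] - theta[:, None]      # phase_diff[i, j] = theta_j - theta_i
    dtheta = omega + (K * np.sin(phase_diff)).sum(axis=1) / N
    theta = (theta + dt * dtheta) % (2.0 * np.pi)

    # Hebbian-like adaptation: in-phase pairs strengthen, drifting pairs decay.
    K += dt * eps * (np.cos(phase_diff) - K)
    np.fill_diagonal(K, 0.0)

# The coupling matrix should now show two clusters aligned with the populations.
within = K[np.ix_(group == 0, group == 0)].mean()
between = K[np.ix_(group == 0, group == 1)].mean()
print(f"mean within-population coupling:  {within:.3f}")
print(f"mean between-population coupling: {between:.3f}")

Running the sketch, the mean within-population coupling should end up much larger than the mean between-population coupling, i.e. the coupling matrix has split into two persistent structural clusters aligned with the two resonant-frequency populations.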
Previously