Updates on my research and papers, discussion of open problems, and other physics-related topics. By Gionni Marchetti
17 April 2024. Recently, I proposed to model the time-dependent learning probabilities in the dynamical system associated with the Bayesian Naming Game (BNG) as logistic functions, which are routinely used in neural networks and deep learning. In the paper Marchetti et al., Front. Phys. 8:10 (2020), where the original model was proposed, the probabilities are assumed to be time-independent and equal, in order to determine the dynamical system's equilibria (or fixed points) and their stability character. Although this assumption leads to the correct results, yielding two asymptotically stable fixed points, as confirmed by several multi-agent simulations, it cannot be used to prove these findings in full generality. In fact, if the solution starts from a particular initial condition, it does not converge to the fixed points, because the dynamical system under study lacks the necessary symmetry-breaking process. This is similar to what happens in the original Naming Game, as noted in Castellano et al., Rev. Mod. Phys. 81, 591 (2009). Remarkably, numerical integration of the dynamical system's first-order differential equations with the proposed modeling yields the correct behaviour of the solution, as observed in multi-agent simulations with a system size of ~200, where, due to noise, the symmetry breaking always occurs. This is illustrated in the figure below.
Solution's components x, y of the 2D dynamical system associated with the BNG model as functions of time t. (Inset) The learning probabilities are suitable logistic functions.
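The exact BNG equations are in the paper cited above; as a purely illustrative sketch (the parameter values below are my own, not taken from the paper), the logistic learning probabilities have the familiar sigmoid form used in deep learning:

```python
import numpy as np

def logistic(t, k=1.0, t0=0.0):
    """Logistic (sigmoid) function, the standard activation used in
    neural networks, here reused as a time-dependent learning probability."""
    return 1.0 / (1.0 + np.exp(-k * (t - t0)))

# Two learning probabilities with different steepness k and offset t0
# are no longer equal, which is what breaks the symmetry between the
# competing names (illustrative parameters only).
t = np.linspace(0.0, 10.0, 101)
p_A = logistic(t, k=1.5, t0=3.0)
p_B = logistic(t, k=0.8, t0=5.0)
```

Each probability stays in (0, 1) and increases monotonically with time, as a learning probability should.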
My preprint on the symmetry breaking and the non-generic bifurcation in the classical Naming Game can be found on arXiv.
25 April 2024. Over the last six months, I attended three Catalan courses provided by www.cpnl.cat. I passed the final examination (oral, listening, and written) in the Catalan language at Level A2. Below you can see my certificate.
16 May 2024. Today I opened Naming-Game-Models, a repository on my GitHub account that will contain updated Python 3.8 scripts to simulate the Naming Game (NG) model and its many variants. In recent years, while developing a new variant of the NG based on Bayesian inference, I wrote many scripts for simulating the dynamics of these variants, including their time evolution on different types of complex networks such as random graphs, small-world networks, and scale-free networks. However, the scripts were written in Python 2.7, as this version was required for running them in parallel on computer clusters. Python 2.7 reached its end of life on January 1st, 2020, but until a year ago I still had to write scripts in Python 2.7. For this reason, I am rewriting the previous scripts in Python 3.8, also adding some changes and improvements where required.
At the moment, you can find the script ng2c.py for simulating the semiotic dynamics of an ensemble of N agents restricted to two names, A and B, corresponding to the integers 0 and 1, respectively, starting from an initial configuration chosen by the user. Note that in such a model consensus is always reached at some time (the convergence time), that is, the system converges to a state where all agents have the same single name, either A or B, in their inventories. For such a simple model, apart from the convergence time, there are only two observables: the total number of names and the success rate of the agents' pairwise interactions. The simulation outputs, including the convergence time and the maximum number of names, are stored in the file results.text.
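The repository script is the authoritative version; as a minimal self-contained sketch of the same two-name dynamics (my own illustrative implementation, not ng2c.py itself), the pairwise speaker-hearer rule can be written as:

```python
import random

def naming_game_2names(N=200, max_steps=200_000, seed=1):
    """Minimal two-name Naming Game: N agents, names 0 (A) and 1 (B).
    Each agent starts with one random name in its inventory.
    Returns the convergence time (number of pairwise interactions),
    or None if consensus is not reached within max_steps."""
    rng = random.Random(seed)
    inventories = [{rng.randint(0, 1)} for _ in range(N)]
    for step in range(1, max_steps + 1):
        s, h = rng.sample(range(N), 2)           # pick speaker and hearer
        name = rng.choice(sorted(inventories[s]))
        if name in inventories[h]:               # success: both keep only that name
            inventories[s] = {name}
            inventories[h] = {name}
        else:                                    # failure: hearer learns the name
            inventories[h].add(name)
        # consensus: every inventory is the same single name
        if all(inv == inventories[0] and len(inv) == 1 for inv in inventories):
            return step
    return None

t_conv = naming_game_2names(N=200, seed=42)
```

Tracking the total number of names and the success rate alongside the loop would recover the two observables mentioned above.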
27 May 2024. My manuscript "Revisiting the Thomas-Fermi Potential for Three-Dimensional Condensed Matter Systems" has been accepted for publication by the European Physical Journal B (EPJB). You can find the preprint on arXiv.
3 June 2024. Finally, my article "Generalized naming game and Bayesian naming game as dynamical systems" has been published. Among the results, it shows that there exists a non-generic bifurcation in the generalized naming game, a classical model of semiotic dynamics, and how such a bifurcation modifies the noise of the stochastic sample paths involved. In this regard, it is found that there are two possible stochastic processes: Brownian motion and the Ornstein-Uhlenbeck process. Finally, it shows that the logistic function, widely used in deep learning, can be employed for modeling word learning processes according to the Bayesian concept learning framework proposed by Josh Tenenbaum. As a result, such an approach introduces a symmetry-breaking mechanism otherwise absent in the Bayesian dynamical system. Please have a look at https://lnkd.in/dGaDV8Ac
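For readers unfamiliar with the two processes mentioned, a standard Euler-Maruyama discretization simulates an Ornstein-Uhlenbeck path (a generic textbook sketch with illustrative parameters, unrelated to the values found in the paper); setting theta = 0 recovers plain Brownian motion:

```python
import numpy as np

def euler_maruyama_ou(theta=1.0, sigma=0.3, x0=1.0, dt=1e-3, n_steps=5000, seed=0):
    """Simulate dX = -theta * X dt + sigma dW with the Euler-Maruyama scheme.
    theta > 0 gives a mean-reverting Ornstein-Uhlenbeck process;
    theta = 0 reduces it to Brownian motion."""
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))      # Wiener increment
        x[i + 1] = x[i] - theta * x[i] * dt + sigma * dW
    return x
```

The qualitative difference is visible in long runs: the OU path fluctuates around zero, while the Brownian path wanders without a restoring drift.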
Here you can see some pictures of Fraser Stoddart that I took when he came to Barcelona in 2023 for a talk about Artificial Molecular Machines.
Over the course of his career, Stoddart has published more than 1,400 scientific articles—with approximately 109,000 citations—and has supervised more than 480 researchers, of whom approximately 115 were research trainees (doctoral students). Source: Universitat de Barcelona
Fraser Stoddart won the Nobel Prize in Chemistry in 2016. During the talk, he said that his most important legacy would be mentoring his many students and postdocs.
Today the Three Kings brought me a second badge (see figure on the left), which shows dedication and improved skills in the Data Fundamentals online course offered by IBM.
I enjoyed the Basics of Quantum Information course offered by IBM. Some of the quantum mechanical problems in the exam are challenging, but my quantum mechanics knowledge proved useful again. For people who wish to learn some quantum mechanics, I suggest studying Sakurai's Modern Quantum Mechanics, where Dirac's bra-ket notation is used throughout the book. My new badge for passing the corresponding exam is shown on the left. Below is a list of some good books I would suggest for learning quantum mechanics as a physicist (the list is not exhaustive; there are many books on QM):
J. Sakurai: Modern Quantum Mechanics
F. Mandl: Quantum Mechanics
L. Susskind: Quantum Mechanics: The Theoretical Minimum
07-06-2025 (Andrews' Curves) I found a very simple and useful method to visualize high-dimensional data points by mapping them to Andrews' curves. Below, I plot the Andrews' curves of a dataset consisting of 88 observations in a 9-dimensional Euclidean space.
Notably, the two most dissimilar curves—shown in blue and red—correspond to the data points with the largest Euclidean distance between them, which is approximately 1.7 (see the accompanying histogram of all possible pairwise distances). This correspondence arises because the L² distance between two Andrews' curves is simply the Euclidean distance between the underlying points, rescaled by the constant factor √π: by the orthogonality of the trigonometric basis, the squared L² distance between the curves equals π times the squared Euclidean distance.
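A short numpy sketch (my own, with random illustrative data rather than the 88-point dataset) builds the Andrews curve of an observation and checks the distance relation numerically:

```python
import numpy as np

def andrews_curve(x, t):
    """Andrews curve f_x(t) = x1/sqrt(2) + x2 sin(t) + x3 cos(t)
    + x4 sin(2t) + x5 cos(2t) + ..., evaluated at the points t."""
    f = np.full_like(t, x[0] / np.sqrt(2.0))
    for j, xj in enumerate(x[1:], start=1):
        k = (j + 1) // 2                       # harmonic index 1, 1, 2, 2, ...
        f += xj * (np.sin(k * t) if j % 2 == 1 else np.cos(k * t))
    return f

# Check: squared L2 distance between curves over [-pi, pi] equals
# pi times the squared Euclidean distance between the points.
rng = np.random.default_rng(0)
x, y = rng.normal(size=9), rng.normal(size=9)  # two points in R^9
t = np.linspace(-np.pi, np.pi, 4096, endpoint=False)
diff2 = (andrews_curve(x, t) - andrews_curve(y, t)) ** 2
l2_curves = np.sqrt(diff2.sum() * (2.0 * np.pi / t.size))  # Riemann sum
```

The Riemann sum over a full period is essentially exact here because the integrand is a trigonometric polynomial, so l2_curves matches √π times the Euclidean distance to machine precision.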
15-06-2025. Finally, I have uploaded the data from high-dimensional trajectories of the Fermi-Pasta-Ulam-Tsingou model, consisting of 4,000,000 datapoints, to Zenodo. The orbits have been accurately computed using a suitable symplectic integrator and, in some cases, validated through comparison with previous simulations to confirm the accuracy of the data.
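The dataset itself was produced with production code not shown here; as an illustration of the idea, here is a velocity-Verlet sketch (a standard second-order symplectic integrator) for the FPUT-β chain with fixed ends, using my own illustrative parameters:

```python
import numpy as np

def fput_force(q, beta):
    """Force on each oscillator of the FPUT-beta chain with fixed ends,
    from the bond potential V(d) = d^2/2 + beta * d^4/4."""
    qq = np.concatenate(([0.0], q, [0.0]))    # fixed boundary conditions
    d = np.diff(qq)                           # bond extensions
    f_bond = d + beta * d**3                  # tension V'(d) in each bond
    return f_bond[1:] - f_bond[:-1]

def fput_energy(q, p, beta):
    """Total energy (kinetic + bond potential) of the chain."""
    qq = np.concatenate(([0.0], q, [0.0]))
    d = np.diff(qq)
    return 0.5 * np.sum(p**2) + np.sum(0.5 * d**2 + 0.25 * beta * d**4)

def velocity_verlet(q, p, beta=0.1, dt=0.01, n_steps=10_000):
    """Velocity-Verlet integration; symplectic, so the energy error
    stays bounded instead of drifting."""
    f = fput_force(q, beta)
    for _ in range(n_steps):
        p_half = p + 0.5 * dt * f
        q = q + dt * p_half
        f = fput_force(q, beta)
        p = p_half + 0.5 * dt * f
    return q, p

# Example: evolve the lowest normal mode of a 32-oscillator chain.
N = 32
i = np.arange(1, N + 1)
q0 = np.sin(np.pi * i / (N + 1))
p0 = np.zeros(N)
q1, p1 = velocity_verlet(q0, p0, beta=0.1, dt=0.01, n_steps=10_000)
```

Checking that fput_energy stays nearly constant along the run is the basic sanity test for any such integrator.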
This dataset is a bonanza for investigating the connection between ergodicity and the geometry and topology of the manifolds on which the trajectories lie, or near which they evolve. It is particularly valuable for applying state-of-the-art unsupervised machine learning techniques such as manifold learning and topological data analysis.
A particularly intriguing direction is to explore how increasing the model’s nonlinearity leads to symmetry breaking, and how this relates to changes in the underlying Riemannian manifold.
21 June 2025. My preliminary computations using a Variational Autoencoder (VAE) are consistent with results from a manifold reduction approach based on t-SNE, applied to high-dimensional trajectory data of the Fermi-Pasta-Ulam-Tsingou (FPUT) model. The data lie in a 64-dimensional phase space corresponding to a system of 32 coupled harmonic oscillators. Both unsupervised machine learning algorithms yield two-dimensional embeddings (shown in the left and right panels below for the VAE and t-SNE, respectively), revealing that the system evolves along a closed orbit under weak nonlinearity (β = 0.1). Remarkably, unlike t-SNE, the VAE does not require an informative initialization to capture both the global and local structure of the data. #AI4Science
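As a toy illustration of the t-SNE side of this comparison (synthetic data standing in for the actual FPUT trajectories, and scikit-learn assumed available), a closed orbit embedded in 64 dimensions can be reduced to a 2D embedding as follows:

```python
import numpy as np
from sklearn.manifold import TSNE

# Toy stand-in for the trajectory data: points on a closed orbit
# (a circle) embedded in a random 2D plane of R^64, plus small noise.
rng = np.random.default_rng(0)
angles = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
basis = np.linalg.qr(rng.normal(size=(64, 2)))[0]        # orthonormal 2D plane
X = np.column_stack([np.cos(angles), np.sin(angles)]) @ basis.T
X += 0.01 * rng.normal(size=X.shape)

# init="pca" is the kind of informative initialization mentioned above;
# with init="random", t-SNE may preserve only local structure and miss
# the global shape of the orbit.
emb = TSNE(n_components=2, init="pca", perplexity=30, random_state=0).fit_transform(X)
```

Re-running with init="random" and comparing the two embeddings makes the initialization-dependence of t-SNE concrete.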