TIMETABLE:
Wednesday 16.00-19.00 - Aula Careri
Thursday 16.00-19.00 - Aula Careri
Lectures:
25/09/2024 (3 h)
Introduction to complex systems.
Accompanying slides on the Prolusion [PDF]
26/09/2024 (3 h)
(1) Introduction to probability density functions (pdf). Observables and non-normalizable distributions. Distribution of a function of a random variable and its relation to the generation of random numbers. The sum of two random variables. Characteristic functions and the convolution theorem. Cumulants for the sum and the average of i.i.d. random variables. (2) Moments for the sum and the average of i.i.d. random variables. Derivation of the Central Limit Theorem for distributions with well-defined moments. Maximum entropy principle for a pdf defined on the whole real axis with finite, fixed mean and variance. (3) Introduction to power-laws: normalization and representations. Frequency-rank plot and its relation to the pdf. Maximum entropy principle for power-laws. Generation of random variables with a generic probability distribution: the case of power-laws.
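A minimal Python sketch of the inverse-transform generation of power-law random variables mentioned in point (3), assuming the pure power-law pdf p(x) = (alpha - 1) x_min^(alpha - 1) x^(-alpha) for x >= x_min (function names and parameter values are illustrative):

```python
import numpy as np

def sample_power_law(alpha, x_min, size, rng=None):
    """Inverse-transform sampling for p(x) = (alpha-1) * x_min**(alpha-1) * x**(-alpha), x >= x_min."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(size=size)                       # u ~ Uniform(0, 1)
    # Invert the CDF F(x) = 1 - (x / x_min)**(1 - alpha)
    return x_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

samples = sample_power_law(alpha=2.5, x_min=1.0, size=100_000)
print(samples.mean())   # finite for alpha > 2; the variance diverges for alpha <= 3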
02/10/2024 (3 h)
(1) The maximum-likelihood criterion. Application to the estimation of the exponent of a power-law and of the error on the estimate. Estimate of the maximum value of a sample drawn from a power-law pdf. Probability distribution functions for extreme events drawn from a power-law distribution; Fréchet distribution. (2) Cauchy-Lorentz distribution: calculation of the first and second moments; computation of the characteristic function and proof that a sum of Cauchy-distributed random variables is again Cauchy-distributed. (3) Lévy distribution: calculation of the moments. Stable and alpha-stable distributions. Central Limit Theorem for long-tailed distributions with well-defined variance.
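A minimal sketch of the maximum-likelihood estimate of the power-law exponent from point (1), using the standard estimator alpha_hat = 1 + n / sum_i ln(x_i / x_min) with standard error (alpha_hat - 1) / sqrt(n) (names and sample values are illustrative):

```python
import numpy as np

def fit_power_law_mle(x, x_min):
    """Maximum-likelihood exponent of p(x) ~ x**(-alpha) for x >= x_min, with its standard error."""
    x = np.asarray(x)
    x = x[x >= x_min]
    n = len(x)
    alpha_hat = 1.0 + n / np.sum(np.log(x / x_min))
    sigma = (alpha_hat - 1.0) / np.sqrt(n)           # leading-order error on the estimate
    return alpha_hat, sigma

# Sanity check against samples generated with the inverse-transform method
rng = np.random.default_rng(0)
u = rng.uniform(size=50_000)
x = 1.0 * (1.0 - u) ** (-1.0 / (2.5 - 1.0))          # alpha = 2.5, x_min = 1
print(fit_power_law_mle(x, x_min=1.0))               # close to (2.5, ~0.007)
```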
03/10/2024 (3 h)
Central Limit Theorem for long-tailed distributions without a well-defined variance: the case of the Cauchy distribution as an alpha-stable distribution. Benford's law: empirical observations, scaling arguments and an explanation based on multiplicative processes. More on multiplicative processes and log-normal distributions. Zipf's law: empirical observations, and normalization depending on the value of the exponent. Relationship between Zipf's exponent and the exponent of the corresponding probability distribution function. Heaps' law: empirical observations. Taylor's law and an example of its derivation for a Poisson process.
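A quick numerical illustration of how a multiplicative process reproduces Benford's law for leading digits, p(d) = log10(1 + 1/d) (a sketch with arbitrary parameters, not the specific process analysed in class):

```python
import numpy as np

rng = np.random.default_rng(1)
# Multiplicative process: repeatedly multiply by random factors, then read off the leading digits
x = np.ones(20_000)
for _ in range(100):
    x *= rng.uniform(0.5, 2.0, size=x.size)

leading = (x / 10.0 ** np.floor(np.log10(x))).astype(int)   # first significant digit, 1..9
for d in range(1, 10):
    empirical = np.mean(leading == d)
    benford = np.log10(1 + 1 / d)
    print(d, round(empirical, 3), round(benford, 3))
```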
09/10/2024 (3 h)
First notions of scaling: Galileo (1638) and examples from biology and cities. Scale invariance and power-laws. Mechanisms leading to the emergence of power-laws: combinations of exponentials (the monkey-typing model) and inverse quantities. Yule-Simon process: calculation of the probability distribution of frequencies and of the frequency-rank distribution.
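A minimal simulation sketch of the Yule-Simon process described above: with probability alpha a brand-new species is introduced, otherwise an existing one is reinforced proportionally to its current frequency (parameter values are illustrative):

```python
import random
from collections import Counter

def yule_simon(steps, alpha, seed=0):
    """Simulate the Yule-Simon process: with prob. alpha add a brand-new species,
    otherwise reinforce a species chosen proportionally to its current frequency."""
    random.seed(seed)
    sequence = [0]                 # start with a single species
    next_label = 1
    for _ in range(steps):
        if random.random() < alpha:
            sequence.append(next_label)
            next_label += 1
        else:
            # uniform choice over past tokens = choice proportional to frequency
            sequence.append(random.choice(sequence))
    return Counter(sequence)

freqs = yule_simon(steps=200_000, alpha=0.1)
# The frequency distribution is expected to decay as k**(-(1 + 1/(1 - alpha)))
print(freqs.most_common(5))
```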
10/10/2024 (3 h)
More on the Yule-Simon model. Solution of the quenched version of the model, in which at each time step m species are reinforced and one brand-new species is introduced. First-return times of a one-dimensional random walk. Introduction to critical phenomena. Example of percolation: phenomenology and real-space renormalisation group in 1-D and on the 2-D triangular lattice. Self-Organised Criticality (SOC): introduction to the overall phenomenology. Criticality vs. Self-Organised Criticality: the role of time-scale separation in SOC, which replaces the fine-tuning to the critical point. Sandpile model: definition of the main quantities and of the critical exponents. A brief introduction to forest-fire models. Hints at the real-space renormalisation group for forest-fire models.
Accompanying slides about Power-laws and Scale Invariance [POWER_LAWS.pdf]
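A small numerical sketch of the real-space renormalisation group for site percolation on the 2-D triangular lattice mentioned above, assuming the standard 3-site majority-rule cell p' = p^3 + 3 p^2 (1 - p) as the coarse-graining rule:

```python
import numpy as np

def rg_map(p):
    """One RSRG step for site percolation on the triangular lattice:
    a 3-site cell is 'occupied' if at least 2 of its sites are occupied."""
    return p**3 + 3 * p**2 * (1 - p)

# Iterating the map: p flows to 0 below p* = 1/2 and to 1 above it
for p0 in (0.4, 0.5, 0.6):
    p = p0
    for _ in range(20):
        p = rg_map(p)
    print(p0, "->", round(p, 4))

# Correlation-length exponent from the linearised map at the fixed point p* = 1/2
b = np.sqrt(3)                          # length rescaling factor of the 3-site cell
lam = 6 * 0.5 * (1 - 0.5)               # dp'/dp evaluated at p* = 1/2
print("nu =", np.log(b) / np.log(lam))  # ~1.35, close to the exact 2-D value 4/3
```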
16/10/2024 (2 h)
Derivation of the functional form of the Shannon entropy from three main requirements. Relation between the Shannon entropy and the thermodynamic entropy. Cross-entropy and relative entropy (Kullback-Leibler divergence). Jensen's inequality and positivity of the Kullback-Leibler divergence.
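A short sketch computing the Shannon entropy and the Kullback-Leibler divergence for discrete distributions, illustrating the positivity of the divergence (the example distributions are arbitrary):

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2 p_i, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def kl_divergence(p, q):
    """D(p || q) = sum_i p_i log2(p_i / q_i); non-negative by Jensen's inequality."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

p = np.array([0.5, 0.25, 0.25])
q = np.array([1/3, 1/3, 1/3])
print(shannon_entropy(p))      # 1.5 bits
print(kl_divergence(p, q))     # > 0, and = 0 only when p == q
```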
17/10/2024 (3 h)
Mutual information. Proof that conditioning reduces entropy.
N-block entropy and differential entropy. The entropy rate of stationary stochastic processes. Entropy rate for i.i.d. random variables and for Markov processes. Shannon-McMillan-Breiman theorem (AEP), with the proof for i.i.d. random variables. Typical sequences. Consequences for coding: interpretation of the Shannon entropy as the average number of bits per character in optimal binary coding.
23/10/2024 (2 h)
Recap of the interpretation of the Shannon entropy as the average number of bits per character in optimal binary coding. Equivalence of the canonical and microcanonical ensembles through the AEP. Symbol codes. Non-singular, uniquely decodable and instantaneous (prefix-free) codes. Codewords of prefix-free codes as leaves of binary trees. Kraft equality for rooted binary trees. Kraft inequality for prefix-free codes. Entropy bounds for the expected length of prefix-free codes.
24/10/2024 (3 h)
Kraft inequality for uniquely decodable codes. The Huffman code and its optimality. Entropy estimation: the estimated N-block entropy is a lower bound for the true N-block entropy. The Shannon game and proportional gambling as ways to estimate the entropy of a language.
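A compact sketch of Huffman coding for a toy alphabet, checking that the average codeword length satisfies H <= L < H + 1 (the heap-based construction and the probabilities below are illustrative):

```python
import heapq
from math import log2

def huffman_code(probs):
    """Build a binary Huffman code for a dict {symbol: probability}."""
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)          # two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"a": 0.45, "b": 0.25, "c": 0.15, "d": 0.10, "e": 0.05}
code = huffman_code(probs)
H = -sum(p * log2(p) for p in probs.values())
L = sum(probs[s] * len(w) for s, w in code.items())
print(code)
print(f"entropy = {H:.3f} bits, average length = {L:.3f} bits")   # H <= L < H + 1
```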
30/10/2024 (2 h)
Algorithmic (Kolmogorov) complexity: definition, properties, and its relation to the Shannon entropy. Compressors: LZ77 and its asymptotic optimality. Introduction to deterministic chaos.
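A quick illustration of entropy estimation via compression: zlib implements DEFLATE (LZ77 followed by Huffman coding), so the compressed size per character roughly tracks the entropy rate of simple sources (the test strings below are arbitrary):

```python
import zlib
import random

random.seed(0)
n = 100_000
random_text = "".join(random.choice("ab") for _ in range(n))    # source entropy: 1 bit/char
biased_text = "".join(random.choice("aaab") for _ in range(n))  # source entropy: ~0.81 bit/char
periodic_text = "ab" * (n // 2)                                  # entropy rate -> 0

for name, text in [("random", random_text), ("biased", biased_text), ("periodic", periodic_text)]:
    compressed_bits = 8 * len(zlib.compress(text.encode(), 9))
    print(name, round(compressed_bits / n, 3), "bits/char")
```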
06/11/2024 (2 h)
Logistic map and Bernoulli shift map. Symbolic dynamics. Kolmogorov-Sinai entropy of dynamical systems.
Slides [PDF]
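A minimal sketch estimating the Lyapunov exponent of the logistic map x_{t+1} = r x_t (1 - x_t) as the time average of ln |f'(x_t)|; at r = 4 the map is conjugate to the Bernoulli shift and the exponent equals ln 2, which for this one-dimensional map also coincides with the Kolmogorov-Sinai entropy (parameters are illustrative):

```python
import numpy as np

def lyapunov_logistic(r, x0=0.4, n_transient=1_000, n_steps=100_000):
    """Estimate the Lyapunov exponent of x_{t+1} = r x_t (1 - x_t)
    as the time average of ln |f'(x_t)| = ln |r (1 - 2 x_t)|."""
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_steps):
        x = r * x * (1 - x)
        total += np.log(abs(r * (1 - 2 * x)))
    return total / n_steps

print(lyapunov_logistic(4.0))   # ~ ln 2 = 0.693: fully chaotic regime
print(lyapunov_logistic(3.2))   # negative: stable period-2 orbit
```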
07/11/2024 (3 h)
The entropy of a continuous random variable (differential entropy). The maximum entropy principle. Equivalence between maximising the likelihood and imposing the constraints when computing the Lagrange multipliers. Consistency of maximum likelihood. Maximum-likelihood and Bayesian predictions for the probability of a Bernoulli variable. The beta distribution and the beta function. Conjugacy of the beta distribution with the Bernoulli distribution. The Pólya urn model. Exchangeability. Probabilities and moments in the Pólya urn model.
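A minimal sketch of the Beta-Bernoulli conjugate update mentioned above: with a Beta(a, b) prior and n Bernoulli observations, the posterior is Beta(a + heads, b + tails), and its mean gives the Bayesian predictive probability (the numbers below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = 0.3
data = rng.random(50) < theta_true           # 50 Bernoulli(0.3) observations

a0, b0 = 1.0, 1.0                             # Beta(1, 1) = uniform prior
heads, tails = data.sum(), (~data).sum()
a_post, b_post = a0 + heads, b0 + tails       # conjugacy: the posterior is again a Beta

theta_mle = heads / len(data)                 # maximum-likelihood estimate
theta_bayes = a_post / (a_post + b_post)      # posterior mean = Bayesian predictive probability
print(theta_mle, theta_bayes)
```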
13/11/2024 (2 h)
Limit distribution of the Pólya urn model. De Finetti's theorem (stated without proof). The connection between the Pólya urn model and Bayesian inference. Introduction to innovation dynamics: Heaps' and Zipf's laws in different datasets.
15/11/2024 (3 h) Lecture on innovation dynamics. Slides [PDF]
20/11/2024 (3 h) (1) Introduction to network science. Phenomenology in several domains: technological networks, information networks, social networks. (2) Basic notions of graph theory: undirected and directed networks, unweighted and weighted networks, single and multiple edges, self-edges, cycles, Directed Acyclic Graphs (DAG). Adjacency matrix. Bipartite networks and the incidence matrix. Projections of bipartite networks. Trees. (3) Degree, in-degree, out-degree, mean degree, density. Walks, paths and loops, and their counting in terms of powers of the adjacency matrix. Laplacian matrix and its properties.
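A small sketch of point (3): degrees, counting walks of length k through powers of the adjacency matrix, and the Laplacian matrix L = D - A (the toy graph is arbitrary):

```python
import numpy as np

# Toy undirected graph: a square 0-1-2-3 with the extra diagonal 0-2
A = np.array([[0, 1, 1, 1],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [1, 0, 1, 0]])

print(A.sum(axis=1))                      # degrees: [3, 2, 3, 2]
A3 = np.linalg.matrix_power(A, 3)
print(A3[0, 2])                           # number of walks of length 3 from node 0 to node 2
print(int(np.trace(A3)) // 6)             # triangles: tr(A^3) counts each triangle 6 times
L = np.diag(A.sum(axis=1)) - A            # Laplacian matrix L = D - A
print(np.linalg.eigvalsh(L))              # eigenvalues are >= 0; the smallest is 0
```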
21/11/2024 (3 h) No lecture
27/11/2024 (3 h) (1) Laplacian matrix and its properties. Eigenvalues and eigenvectors. Random walks on networks. (2) Centrality measures: degree, eigenvector centrality, Katz centrality. PageRank centrality. Betweenness centrality. (3) Transitivity and clustering coefficient. Assortativity. Random graphs G(n,m) and G(n,p).
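A minimal sketch of PageRank computed by power iteration on a toy directed graph, with damping factor alpha and dangling nodes redistributed uniformly (an assumed convention; the graph and values are illustrative):

```python
import numpy as np

def pagerank(A, alpha=0.85, tol=1e-10):
    """PageRank by power iteration on a directed graph with adjacency matrix
    A[i, j] = 1 if there is an edge i -> j; dangling nodes jump uniformly."""
    n = A.shape[0]
    out_deg = A.sum(axis=1)
    # Row-stochastic transition matrix; rows of dangling nodes become uniform
    P = np.where(out_deg[:, None] > 0, A / np.maximum(out_deg, 1)[:, None], 1.0 / n)
    r = np.ones(n) / n
    while True:
        r_new = alpha * r @ P + (1 - alpha) / n     # random-walk step plus teleportation
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 0, 0]])     # node 3 is a dangling node
print(pagerank(A))
```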
28/11/2024 (3 h) (1) Random graphs G(n,p) and their properties: diameter and size of the giant component. Configuration model. (2) Average nearest-neighbour degree k_nn as a measure of assortativity and its interpretation in the configuration model. Friendship paradox and the excess degree distribution. Models of network formation: preferential attachment. (3) Barabási-Albert model: definition and calculation of the degree distribution via two different methods, the differential equation for the evolution of the degree and the master equations for the in-degree and the total degree distribution. First notions of epidemic spreading. SI model.
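A minimal simulation sketch of Barabási-Albert preferential attachment using the repeated-node-list trick, which selects targets proportionally to their degree (parameters are illustrative; multi-edges are not removed in this simplified version):

```python
import numpy as np
from collections import Counter

def barabasi_albert(n, m, seed=0):
    """Grow a BA network: each new node attaches m edges to existing nodes
    chosen proportionally to their degree (via a list where node i appears deg(i) times)."""
    rng = np.random.default_rng(seed)
    targets = list(range(m))          # the first new node connects to the m initial nodes
    repeated = []                     # node i appears deg(i) times in this list
    degrees = Counter()
    for new_node in range(m, n):
        for t in targets:
            degrees[new_node] += 1
            degrees[t] += 1
            repeated.extend([new_node, t])
        # choose m (possibly repeated) targets with probability proportional to degree
        targets = [repeated[i] for i in rng.integers(0, len(repeated), size=m)]
    return degrees

degrees = barabasi_albert(n=50_000, m=3)
# The tail of the degree distribution is expected to follow p(k) ~ k**(-3)
counts = Counter(degrees.values())
print(sorted(counts.items())[:5])
```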
4/12/2024 (3 h) (1) The SI, SIR and SIS models under the homogeneous-mixing hypothesis. Graphical solutions and identification of the epidemic threshold. Basic reproduction number and its link with the epidemic threshold. (2) Epidemics on networks. Transmission probability and the mapping of the epidemic threshold onto bond percolation. The SIR model on networks: calculation of the epidemic threshold for random graphs and for power-law degree distributions, and discussion of how the threshold can vanish for specific networks. Size of the giant component. The SIS model on networks: degree-based mean-field approach, with the derivation of the epidemic threshold and of the order parameter (density of infected nodes). (3) Prolusion to the study of social dynamics. Historical perspective and main challenges. Models for the formation of consensus. Main mechanisms (social pressure, imitation, homophily, etc.).
Accompanying slides about Networks [PDF]
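A short sketch of the mean-field SIR equations from point (1) of the 4/12 lecture, integrated with a plain Euler scheme to show the epidemic threshold at R0 = beta/gamma = 1 (step size and parameter values are illustrative):

```python
import numpy as np

def sir_homogeneous(beta, gamma, i0=1e-4, dt=0.01, t_max=200.0):
    """Euler integration of the mean-field SIR equations
    ds/dt = -beta*s*i,  di/dt = beta*s*i - gamma*i,  dr/dt = gamma*i."""
    s, i, r = 1.0 - i0, i0, 0.0
    for _ in range(int(t_max / dt)):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s, i, r = s + dt * ds, i + dt * di, r + dt * (gamma * i)
    return r                                   # final epidemic size

gamma = 1.0
for beta in (0.5, 1.0, 2.0):                   # R0 = beta / gamma
    print(f"R0 = {beta/gamma:.1f}  final size = {sir_homogeneous(beta, gamma):.3f}")
```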
5/12/2024 (3 h) Kinetic Ising model at zero temperature. Voter model. Equivalence of the Voter model in 1-D with the kinetic Ising model at T=0 in 1-D. Fokker-Planck equation for the Voter model in mean field and its detailed analysis: derivation of the exit probability and of the consensus time. Comparison with the mean-field Ising model and with a local field composed of k randomly chosen neighbours. Classification of opinion-dynamics models, with a few examples: the majority-rule model, the Sznajd model, the Deffuant model. Axelrod's model for cultural assimilation. Discussion of first- and second-order phase transitions.
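A quick numerical check of the mean-field Voter model result derived above: the density of agreeing opinions performs an unbiased random walk, so the exit probability to consensus equals the initial density (the reduction to a fair random walk is the simplification assumed in this sketch):

```python
import numpy as np

def exit_probability(n, x0, runs=2000, seed=0):
    """Mean-field Voter model: conditioned on updates that change the state, the number
    of '+1' opinions performs a fair random walk between 0 and n (gambler's ruin)."""
    rng = np.random.default_rng(seed)
    wins = 0
    for _ in range(runs):
        up = int(round(x0 * n))
        while 0 < up < n:
            up += 1 if rng.random() < 0.5 else -1
        wins += up == n
    return wins / runs

for x0 in (0.2, 0.5, 0.8):
    print(x0, exit_probability(n=50, x0=x0))   # exit probability ~ x0
```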
11/12/2024 (3 h) Specialist seminar on "Infosphere" [PDF]
12/12/2024 (3 h) Specialist seminar on "Sustainable Cities" [PDF]
18/12/2024 (3 h) Specialist seminar on "AI and Augmented Creativity"
Foundations of Neural Networks [PDF1]
Intro to Generative Models https://denise-lanzieri-csl.github.io/talks/lecture_cs_2024/.
Augmented Creativity [PDF2]
19/12/2024 (3 h) Specialist seminar on "Economic Fitness and Complexity" [PDF1][PDF2]