Past Talks - Fall 2021

Nov 18, 2021: Nicole Immorlica (MSR):
"Communicating with Anecdotes"

Abstract: Classic models of communication in economics typically assume agents can communicate any message. However, many important communications, such as those in newspapers or politicians' speeches, use data to convey information. In this talk, we explore how the reliance on data impacts communication. In our model, there are two Bayesian agents (a sender and a receiver) who wish to communicate. The receiver must take an action whose payoff depends on their personal preferences and an unknown state of the world. The sender has access to a collection of data points correlated with the state of the world and can send exactly one of these to the receiver in order to influence her choice of action. Importantly, the sender's personal preferences may differ from the receiver's, which affects the sender's strategic choice of what to send. We show that in a Nash equilibrium even a small difference in preferences can lead to a significant bias in the communicated datum. This can substantially reduce the informativeness of the communication, leading to utility loss for both sides. One implication is informational homophily: a receiver can rationally prefer to obtain data from a poorly-informed sender with aligned preferences, rather than a knowledgeable expert whose preferences may differ from her own.

Joint work with Nika Haghtalab, Brendan Lucier, Markus Mobius and Divya Mohan.
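The core mechanism in the abstract, a sender who selects which datum to transmit based on his own preferences, can be illustrated with a toy simulation. This is a deliberately simplified sketch, not the paper's model: here the "bias" parameter and the sender's selection rule (send the datum closest to his own estimate plus his preference shift) are illustrative assumptions, and the receiver is omitted entirely; the point is only that a preference gap skews the communicated datum.

```python
import random

random.seed(0)

def sample_data(state, n=10, noise=1.0):
    """n noisy observations of the unknown state (toy assumption)."""
    return [state + random.gauss(0, noise) for _ in range(n)]

def sender_choice(data, bias):
    # Sender's ideal point: his own estimate (the sample mean)
    # shifted by his preference gap; he sends the closest datum.
    target = sum(data) / len(data) + bias
    return min(data, key=lambda x: abs(x - target))

def avg_skew(bias, trials=2000):
    """Average gap between the sent datum and the true state."""
    total = 0.0
    for _ in range(trials):
        state = random.gauss(0, 1)
        total += sender_choice(sample_data(state), bias) - state
    return total / trials

print(round(avg_skew(0.0), 2))  # aligned sender: sent datum tracks the state
print(round(avg_skew(1.0), 2))  # misaligned sender: sent datum is biased upward
```

Even this crude selection rule shows the qualitative effect from the abstract: a sender with shifted preferences systematically transmits unrepresentative data.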

Bio: Nicole Immorlica's research lies broadly within the field of economics and computation. Using tools and modeling concepts from both theoretical computer science and economics, Nicole hopes to explain, predict, and shape behavioral patterns in various online and offline systems, markets, and games. Her areas of specialty include social networks and mechanism design. Nicole received her Ph.D. from MIT in Cambridge, MA in 2005 and then completed three years of postdocs at both Microsoft Research in Redmond, WA and CWI in Amsterdam, Netherlands before accepting a job as an assistant professor at Northwestern University in Chicago, IL in 2008. She joined the Microsoft Research New England Lab in 2012.

anecdotes.pdf

Oct 21, 2021: Maxim Raginsky (UIUC):
"Neural SDEs: Deep Generative Models in the Diffusion Limit"

Abstract: In deep generative models, the latent variable is generated by a time-inhomogeneous Markov chain, where at each time step we pass the current state through a parametric nonlinear map, such as a feedforward neural net, and add a small independent Gaussian perturbation. In this talk, based on joint work with Belinda Tzen, I will discuss the diffusion limit of such models, where we increase the number of layers while sending the step size and the noise variance to zero. I will first provide a unified viewpoint on both sampling and variational inference in such generative models through the lens of stochastic control. Then I will show how we can quantify the expressiveness of diffusion-based generative models. Specifically, I will prove that one can efficiently sample from a wide class of terminal target distributions by choosing the drift of the latent diffusion from the class of multilayer feedforward neural nets, with the accuracy of sampling measured by the Kullback-Leibler divergence to the target distribution. Finally, I will briefly discuss a scheme for unbiased, finite-variance simulation in such models. This scheme can be implemented as a deep generative model with a random number of layers.

Bio: Maxim Raginsky received the B.S. and M.S. degrees in 2000 and the Ph.D. degree in 2002 from Northwestern University, all in Electrical Engineering. He has held research positions with Northwestern, the University of Illinois at Urbana-Champaign (where he was a Beckman Foundation Fellow from 2004 to 2007), and Duke University. In 2012, he returned to UIUC, where he is currently an Associate Professor and William L. Everitt Fellow with the Department of Electrical and Computer Engineering, the Coordinated Science Laboratory, and the Department of Computer Science. His research interests cover probability and stochastic processes, deterministic and stochastic control, machine learning, optimization, and information theory. Much of his recent research is motivated by fundamental questions in modeling, learning, and simulation of nonlinear dynamical systems, with applications to advanced electronics, autonomy, and artificial intelligence.

diffusions.pdf

Sep 23, 2021: Christos Papadimitriou (Columbia University):
"How does the brain beget the mind?"

Abstract: How do molecules, cells and synapses effect reasoning, intelligence, planning, language? Despite dazzling progress in experimental neuroscience, as well as in cognitive science at the other extreme of scale, we do not seem to be making progress on the overarching question: the gap is huge and a completely new approach seems to be required. As Richard Axel recently put it: "We don't have a logic for the transformation of neural activity into thought [...]."


What kind of formal system would qualify as this "logic"?


I will introduce the Assembly Calculus (AC), a computational system which appears to be a promising bridge between neurons and cognition. Through this programming framework, a Parser was recently implemented which (a) can handle reasonably complex sentences of English and other languages, and (b) works exclusively through the spiking of neurons.
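A central primitive of the Assembly Calculus is "project": neurons firing in one brain area drive neurons in a downstream area through random synaptic connections, and a winner-take-all cap keeps only the k most strongly activated neurons firing. The sketch below is a simplified illustration under stated assumptions (random bipartite connectivity, a single projection step, no Hebbian plasticity or recurrent convergence), not the talk's or the Parser's actual implementation.

```python
import random

random.seed(0)
n, k, p = 1000, 50, 0.05  # neurons per area, cap size, connection probability

# Random connectivity from area A to area B:
# weights[i][j] = 1.0 if there is a synapse from A-neuron i to B-neuron j.
weights = [[1.0 if random.random() < p else 0.0 for _ in range(n)]
           for _ in range(n)]

def project(firing_set):
    """Sum synaptic input into each B-neuron from the firing A-neurons,
    then fire only the k neurons with the highest input (k-cap)."""
    inputs = [sum(weights[i][j] for i in firing_set) for j in range(n)]
    return set(sorted(range(n), key=lambda j: -inputs[j])[:k])

stimulus = set(range(k))      # a fixed stimulus assembly in area A
assembly = project(stimulus)  # the set of neurons it ignites in area B
print(len(assembly))          # the cap guarantees exactly k neurons fire
```

In the full calculus, repeated projection together with Hebbian weight updates makes the downstream firing set converge to a stable assembly, and operations built from this primitive are what the Parser composes.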

Bio: One of the world’s leading computer science theorists, Christos Papadimitriou is best known for his work in computational complexity, helping to expand its methodology and reach. He has also explored other fields through what he calls the algorithmic lens, having contributed to biology and the theory of evolution, economics, and game theory (where he helped found the field of algorithmic game theory), artificial intelligence, robotics, networks and the Internet, and more recently the study of the brain.

He authored the widely used textbook Computational Complexity, as well as four others, and has written three novels, including the best-selling Logicomix and his latest, Independence. He considers himself fundamentally a teacher, having taught at UC Berkeley for the past 20 years, and before that at Harvard, MIT, the National Technical University of Athens, Stanford, and UC San Diego.


Papadimitriou has been awarded the Knuth Prize, IEEE’s John von Neumann Medal, the EATCS Award, the IEEE Computer Society Charles Babbage Award, and the Gödel Prize. He is a fellow of the Association for Computing Machinery and the National Academy of Engineering, and a member of the National Academy of Sciences.


He received his BS in Electrical Engineering from Athens Polytechnic in 1972, and his MS (1974) and PhD (1976) in Electrical Engineering/Computer Science from Princeton.


Due to technical issues, this talk could not be recorded. We apologize for the inconvenience.

ESSLLI.pdf