Thank you to all who participated in SUMS 2020: MATS, and especially to our presenters! We had talks from professors of philosophy, neuroscience, computer science, and, of course, mathematics. The idea was to investigate different conceptions of thinking systems across disciplines. In philosophy, we discuss what it means to be a thinking system. In neuroscience, we study how the most successful thinking system we know of is built. In computer science, we attempt to build thinking systems from the ground up. And finally, in math, we use thinking systems to understand truth.
Mathematics is one of the most remarkable human enterprises, yet we still lack a comprehensive understanding of how the brain computes even simple arithmetic problems. In this talk, I will present a series of studies in which we used measures of behavior and brain activity to investigate the encoding of elementary math, as well as to better understand the neural architecture, dynamics, and causal role of the underlying brain networks. I will show that math reasoning seems to occupy a distinct network in the human brain, one which largely dissociates from language areas and highly overlaps with regions related to object recognition, visuospatial attention, working memory, and relational thinking. Next, using machine learning and intracranial recordings, I will demonstrate how we can track the cascade of unfolding representational codes during mental arithmetic, shedding light on the precise roles and temporal dynamics of each hub of the math network. Overall, this talk will provide insights into how elementary math concepts are implemented in the brain and, more broadly, show how the case study of math cognition can help us unveil the algorithms of human intelligence.
When two (distinct) points mate, they beget a line; when two (distinct) lines mate, they beget a point. What can be simpler than that? Yet if you let them “be fruitful and multiply,” things get interesting.
A .pdf of these titles and abstracts is available here.
Abstract: In the 1840s, Möbius recounted to his students the following fairy tale: There was once a prince of the Far East who had five sons. The prince stipulated that upon his death his kingdom was to be divided among his sons in polygonal plots so that each son had land bordering the land of each of his brothers. Try as they might, the prince’s surveyors could not devise such a division. In fact, no such division can exist on the surface of the Earth. But on the torus and on surfaces of higher genus, maps of five regions, each region bordering each other region, do exist. Maps in which each region borders each other region are called complete maps. There is a maximum number of regions in a complete map on a given surface, and complete maps with this maximum number of regions are said to be maximally complete. We show that maximally complete maps on orientable surfaces other than the sphere and torus do not have certain symmetry properties. Additionally, we give new, explicit maximally complete maps on orientable surfaces of genus 3 through 7.
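For context, the maximum number of regions in a complete map on an orientable surface of genus g is given by the Heawood number H(g) = ⌊(7 + √(48g + 1))/2⌋, a fact tied to the Ringel–Youngs theorem. A minimal sketch, using exact integer arithmetic:

```python
from math import isqrt

def heawood(genus: int) -> int:
    """Heawood number H(g) = floor((7 + sqrt(48*g + 1)) / 2): the
    maximum number of mutually adjacent regions in a map on the
    orientable surface of genus g (for g = 0 it gives 4, matching
    the Four Color Theorem)."""
    return (7 + isqrt(48 * genus + 1)) // 2

# Genus 1 (the torus) already allows 7 mutually adjacent regions,
# so the five brothers of the fairy tale can be accommodated there.
maxima = {g: heawood(g) for g in range(8)}
# maxima == {0: 4, 1: 7, 2: 8, 3: 9, 4: 10, 5: 11, 6: 12, 7: 12}
```

Note that `isqrt` keeps the computation exact: since 48g + 1 is never a perfect square between the integer floor and the true root in a way that crosses a parity boundary, `(7 + isqrt(...)) // 2` agrees with the real-valued floor.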
Abstract: Finding unexpected connections across various areas of mathematics is exciting and can offer new perspectives on thoroughly studied concepts. In this spirit, we will look at how one of the most-studied sequences—the Fibonacci sequence—makes an elegant appearance within graph theory. Generalizing and expanding upon a method of proof proposed by Lee Sanders, we will prove two generalizations of the theorem that Sanders presents by examining how three Fibonacci identities describe the structure of a specific class of tree graphs. Finally, we will make a connection to an NP-hard problem that offers further insight into how the Fibonacci sequence governs the structure of these graphs.
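The abstract does not name the three identities, but as one illustration of the kind of Fibonacci identity involved, here is a short sketch that checks Cassini’s identity numerically (an example chosen by the editor, not necessarily one of the talk’s three):

```python
def fib(n: int) -> int:
    """Iterative Fibonacci numbers with fib(1) = fib(2) = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Cassini's identity: F(n-1) * F(n+1) - F(n)^2 = (-1)^n,
# a classic example of the kind of identity that can be read
# off from the structure of suitable graphs.
cassini_holds = all(
    fib(n - 1) * fib(n + 1) - fib(n) ** 2 == (-1) ** n
    for n in range(1, 25)
)
```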
Abstract: We introduce the notion of smocked metric spaces and explore the balls and geodesics in a collection of different smocked spaces. We find their rescaled Gromov-Hausdorff limits and prove these tangent cones at infinity exist, are unique, and are normed spaces.
Abstract: An important feature of any thinking machine would be the representation through which it stores, recalls, and manipulates information. It is challenging to capture the nuances of highly semantic concepts, such as the contextual meaning of a word or the precise pixel-wise structure of a portrait. This talk will detail new methodologies from the field of deep learning for constructing information representations which enable semantic manipulation of data. As a concrete example, the Word2Vec algorithm permits representations of words in a vector space such that vector addition of representations corresponds to semantic “addition” of words. Experimentally, the vector “german” + “airlines” is near “Lufthansa.” Emphasis will be placed on the theoretical formulations of neural generative modelling and unsupervised representation learning, the intuitions that connect this theory to exciting experimental results, and on the implications of learned representations for designing thinking machines.
Max has kindly made the slides from this presentation available here.
Abstract: Locally interacting Markov chains describe the joint evolution of a large collection of particles, in which the evolution of each particle depends only on its neighbors within a specified underlying interaction graph. We characterize the marginal dynamics (of a typical particle) for such models on Erdős–Rényi random graphs and deterministic heterogeneous graphs which have dense clusters connected by sparse structures. We will discuss applications of such models to opinion dynamics and neuroscience.
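As a toy illustration of a locally interacting chain (a standard voter model chosen by the editor, not the models analyzed in the talk): opinion dynamics on an Erdős–Rényi graph, where each node’s update depends only on its neighbors.

```python
import random

def erdos_renyi(n, p, rng):
    """Adjacency lists of a G(n, p) random graph: each of the
    n*(n-1)/2 possible edges is present independently with prob. p."""
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    return adj

def voter_step(opinions, adj, rng):
    """One asynchronous voter-model update: a uniformly random node
    copies the opinion of a uniformly random neighbor."""
    i = rng.randrange(len(opinions))
    if adj[i]:
        opinions[i] = opinions[rng.choice(adj[i])]

rng = random.Random(0)  # seeded for reproducibility
adj = erdos_renyi(50, 0.2, rng)
opinions = [rng.randint(0, 1) for _ in range(50)]
for _ in range(10_000):
    voter_step(opinions, adj, rng)
```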
Abstract: Let f : P1 → P1 be a rational function, where P1 is the projective line. We know that conjugating f by an element of PGL2 will produce another rational function with similar dynamical behavior. Define Ratd as the set of degree d rational maps on P1 and the moduli space of degree d rational maps, Md, as the quotient space Ratd / PGL2(Q̄).
The automorphism loci of the moduli space correspond to the set of rational functions with non-trivial automorphism groups and can be further broken down into loci where the automorphism group contains a particular finite subgroup of PGL2. We are able to describe the automorphism loci in M3 and M4 in terms of normal forms, i.e., n-parameter families in Rat3 and Rat4 such that n equals the dimension of the corresponding automorphism locus in the moduli space. Using the normal forms, we are also able to obtain some uniform boundedness results on the rational preperiodic structures of several one-parameter families with non-trivial automorphism groups. Additionally, we look at applications of the automorphisms to statistics over finite fields.
Abstract: The game of Chomp is a turn-based game in which players take turns removing rectangular sections from a chocolate bar. The goal is to force the other player to take the last piece, a designated corner square. Superficially, Chomp appears to be a form of Nim, but it has not yet been reduced to a Nim-sum. A winning strategy for the first player has been shown to exist for all rectangular boards, but beyond a few trivial cases, that general strategy remains unknown.
Our approach has included creating mathematical models of Chomp, which we use to build a series of computer programs that produce winning and losing board states. With these algorithms, we have increased the bound on solved game sizes from 14x14 to 19x19, despite the number of possible board states of an AxB game being (A+B) choose A.
Our findings include hundreds of millions of new pole positions, which are useful for identifying and expanding upon patterns in the pole positions of boards. We have also identified a pruning condition that restricts the search space to an exponential order, as well as the concept of leading-edge pole positions, which may yield further results.
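A minimal memoized win/lose search for tiny Chomp boards (a sketch by the editor, not the authors’ programs), with positions encoded as nonincreasing tuples of row lengths and square (0, 0) as the designated corner square one must not take:

```python
from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def first_player_wins(rows):
    """rows: nonincreasing tuple of row lengths; square (0, 0) is
    the designated corner square.  Returns True iff the player to
    move can force a win by exhaustive game-tree search."""
    for i in range(len(rows)):
        for j in range(rows[i]):
            if (i, j) == (0, 0):
                continue  # taking the corner square is an immediate loss
            # Biting at (i, j) cuts every row from i downward to length j.
            new = []
            for k, r in enumerate(rows):
                cut = r if k < i else min(r, j)
                if cut > 0:
                    new.append(cut)
            if not first_player_wins(tuple(new)):
                return True  # some move leaves the opponent a losing state
    return False  # every move (or no move) hands the opponent a win

# Why this only works for tiny boards: an AxB game has
# comb(A + B, A) states, e.g. comb(38, 19) == 35_345_263_800
# for the 19x19 boards treated in this work.
```

For example, `first_player_wins((2, 1))` is False — the staircase position (n, n-1) is a known second-player win for two-row Chomp — while any square board such as `(3, 3, 3)` is a first-player win, consistent with the strategy-stealing argument mentioned in the abstract.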
Abstract: Order statistics play an important role in multiple disciplines, such as outlier detection, wireless communication, and hypothesis testing. To better understand their behavior and properties, we examine their joint cumulative distribution functions. While there exist closed-form expressions and algorithms to compute this quantity, most are computationally intractable. We present an efficient dynamic programming algorithm for this computation, first assuming i.i.d. random variables, and then extend our algorithm to allow for a subset of order statistics and multiple populations. The best known previous algorithms are exponential with respect to the number of order statistics, whereas our algorithm is polynomial.
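The talk’s own algorithm is not given in the abstract; as a hedged illustration of a polynomial-time recursion in the same spirit, here is a classical O(n^2) Steck/Noé-style recursion for the i.i.d. case, computing P(X_(1) ≤ b_1, …, X_(n) ≤ b_n) from the marginal CDF values c[i] = F(b_i) at nondecreasing thresholds:

```python
from math import factorial

def joint_order_stat_cdf(c):
    """P(X_(1) <= b_1, ..., X_(n) <= b_n) for n i.i.d. samples,
    where c[i] = F(b_{i+1}) are marginal CDF values at sorted,
    nondecreasing thresholds b_1 <= ... <= b_n.
    Classical O(n^2) recursion:  P = n! * v_n  with  v_0 = 1  and
    v_k = sum_{i<k} (-1)^(k-1-i) * v_i * c_{i+1}^(k-i) / (k-i)!."""
    n = len(c)
    v = [1.0] + [0.0] * n
    for k in range(1, n + 1):
        v[k] = sum(
            (-1) ** (k - 1 - i) * v[i] * c[i] ** (k - i) / factorial(k - i)
            for i in range(k)
        )
    return factorial(n) * v[n]
```

Sanity check against the n = 2 closed form: P(X_(1) ≤ b_1, X_(2) ≤ b_2) = 2·c_1·c_2 − c_1², so for c = [0.3, 0.8] the recursion returns 0.39.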
Abstract: Recently, topological data analysis (TDA) has gained traction as a new approach to analyzing point-cloud and time-series data. Using persistent homology, we can obtain information about the “shape” of our data by tracking the existence of connected components, loops/rings (i.e., 2-D holes), voids (i.e., 3-D holes), and other “holes” in higher dimensions through varying sizes and over time. Additionally, similarity between datasets can be evaluated by defining a metric to compute distances between the two TDA-derived structures. We use this approach to investigate behavior in simulations of active particle models. The model exhibits self-organizing behavior typically associated with swarm intelligence and active matter. In our model, agents behave like epithelial cells and can form monolayers, spanning networks, or compact clusters under different parameter assignments. We show that TDA is a robust and effective technique for identifying phase transitions between different swarming behaviors (spanning networks, compact clusters, and monolayers) and for classifying them.
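A minimal sketch of the simplest ingredient of this pipeline — dimension-0 persistent homology (connected components) of a point cloud — computed with a union-find pass over edges in increasing length order, which is equivalent to single-linkage merge heights; higher-dimensional features (loops, voids) need the full machinery and are not shown here.

```python
from itertools import combinations
from math import dist

def h0_persistence(points):
    """Death times of 0-dimensional homology classes (connected
    components) in the Vietoris-Rips filtration of a point cloud:
    process edges by increasing length; each union of two distinct
    components kills one class at that edge length.  All classes
    are born at 0, and one component never dies."""
    parent = list(range(len(points)))

    def find(x):  # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    deaths = []
    edges = sorted((dist(p, q), i, j)
                   for (i, p), (j, q) in combinations(enumerate(points), 2))
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)
    return deaths

# Two well-separated pairs: two early deaths (within each pair) and
# one late death (the pairs merging); one component persists forever.
deaths = h0_persistence([(0, 0), (0, 1), (10, 0), (10, 1)])
# deaths == [1.0, 1.0, 10.0]
```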