UNIST International Workshop on String Theory, Geometry and Machine Learning 2023
Talks
Per Berglund (University of New Hampshire)
On Machine Learning Kreuzer-Skarke Calabi-Yau Manifolds
Namkyeong Cho (Samsung SDS)
Using Reinforcement Learning and Graph Neural Networks in Integer Programming with some Industrial Applications
Combinatorial Optimization (CO) has been a focus of intensive study, leading to the development of numerous heuristic algorithms. Recently, improving these traditional heuristics with reinforcement learning has gained much attention. Given that most CO problems can be formulated as Mixed Integer Programming (MIP) problems, the area is increasingly attracting the interest of machine learning researchers. This talk will introduce methodologies and recent results for solving MIP problems with reinforcement learning and graph neural networks (GNNs). Furthermore, we introduce some applications of MIP to industrial problems.
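As a rough illustration of how a MIP instance min c^T x subject to Ax <= b can be fed to a GNN, the sketch below (toy data and random, untrained weights; not code from the talk) encodes variables and constraints as the two sides of a bipartite graph and runs one round of message passing before scoring variables, e.g. for branching decisions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy MIP instance (illustrative only): 3 constraints, 4 integer variables.
A = rng.integers(-3, 4, size=(3, 4)).astype(float)   # constraint matrix
b = rng.integers(1, 10, size=3).astype(float)         # right-hand sides
c = rng.integers(-5, 6, size=4).astype(float)         # objective coefficients

# Initial node features: objective coefficient per variable, rhs per constraint.
var_feat = c.reshape(-1, 1)   # (n_vars, 1)
con_feat = b.reshape(-1, 1)   # (n_cons, 1)

def relu(x):
    return np.maximum(x, 0.0)

# Random (untrained) weights stand in for learned parameters.
d = 8
W_v, W_c = rng.normal(size=(1, d)), rng.normal(size=(1, d))
U_v, U_c = rng.normal(size=(d, d)), rng.normal(size=(d, d))

# One round of message passing on the variable-constraint bipartite graph.
h_var = relu(var_feat @ W_v)                # embed variable nodes
h_con = relu(con_feat @ W_c)                # embed constraint nodes
h_con = relu(h_con + (A @ h_var) @ U_c)     # constraints gather incident variables
h_var = relu(h_var + (A.T @ h_con) @ U_v)   # variables gather incident constraints

# A trained policy head would map h_var to branching decisions; here we just
# project to scalar scores with a fixed random vector.
scores = h_var @ rng.normal(size=d)
print("branching scores per variable:", scores)
```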
Andrei Constantin (University of Oxford)
Machine Learning for Bundles and Manifolds in String Theory
Sergei Gukov (California Institute of Technology / DIAS)
Learning AC moves
James Halverson (Northeastern University)
Φ⁴ Theory as a Neural Network Field Theory
Elli Heyes (City, University of London)
New Calabi-Yau Manifolds from Genetic Algorithms
Calabi-Yau manifolds can be obtained as hypersurfaces in toric varieties built from reflexive polytopes. We generate reflexive polytopes in various dimensions using a genetic algorithm. As a proof of principle, we demonstrate that our algorithm reproduces the full set of reflexive polytopes in two and three dimensions, and in four dimensions with a small number of vertices and points. Motivated by this result, we construct five-dimensional reflexive polytopes with the lowest number of vertices and points. By calculating the normal form of the polytopes, we establish that many of these are not in existing datasets and therefore give rise to new Calabi-Yau four-folds. In some instances, the Hodge numbers we compute are new as well.
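As a generic illustration of this search strategy (not the authors' implementation), the sketch below runs a standard genetic-algorithm loop over candidate lattice vertex sets; the fitness function is a placeholder and would in practice be replaced by a genuine reflexivity score for the convex hull, computed for instance with PALP or SageMath.

```python
import random

DIM, N_VERT = 4, 6            # ambient lattice dimension and number of vertices
POP, GENS, MUT_RATE = 50, 100, 0.1

def random_candidate():
    # Candidate = list of lattice vertices with small coordinates.
    return [[random.randint(-2, 2) for _ in range(DIM)] for _ in range(N_VERT)]

def fitness(cand):
    # Placeholder: reward vertex sets roughly balanced around the origin.
    # Replace with a genuine reflexivity score of conv(cand).
    return -sum(abs(sum(v[i] for v in cand)) for i in range(DIM))

def crossover(a, b):
    cut = random.randint(1, N_VERT - 1)
    return a[:cut] + b[cut:]

def mutate(cand):
    return [[x + random.choice([-1, 0, 1]) if random.random() < MUT_RATE else x
             for x in v] for v in cand]

population = [random_candidate() for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP // 2]          # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print("best candidate vertices:", best, "fitness:", fitness(best))
```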
Eun-Soo Jung (ALI Corp)
AI for Chat-bots: from Smart Assistants to Friends
In light of the swift advancements in large-scale language models, AI chat-bots have recently attracted widespread attention. We explore the evolution of chat-bots and their increasing role in our daily lives. We delve into the design and development of QA chat-bots, examining their capabilities and discussing real-world use cases. However, we also acknowledge the limitations of current chat-bot technology and explore future directions for improvement and further research.
Andre Lukas (University of Oxford)
Fabian Ruehle (Northeastern University)
Searching for Ribbons with Machine Learning
Andreas Schachner (Cornell University)
JAXVacua -- A Framework for Sampling String Vacua
Moduli stabilisation in string compactifications with many light scalars remains a major blind spot in the string landscape. In these regimes, analytic methods cease to work for generic choices of UV parameters, so numerical techniques have to be employed. In this talk, I report on new numerical techniques to efficiently construct string vacua. This approach heavily utilises automatic differentiation, just-in-time compilation and parallelisation. I argue that this implementation provides a golden opportunity to efficiently analyse large unexplored regions of the string landscape. As a first example, I report on the application of our techniques to the search for Type IIB flux vacua in Calabi-Yau orientifold compactifications.
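As a toy illustration of the ingredients mentioned above (an assumption-laden sketch, not the JAXVacua code), the snippet below uses JAX's autodiff, JIT compilation and vmap-based parallelisation to push a batch of random initial moduli values towards critical points of a made-up scalar potential V.

```python
import jax
import jax.numpy as jnp

def V(phi):
    # Made-up stand-in for a flux-induced scalar potential on two moduli.
    return (phi[0]**2 - 1.0)**2 + (phi[1]**2 - 2.0)**2 + 0.1 * phi[0] * phi[1]

# A vacuum candidate is a critical point of V, so we minimise |dV|^2.
grad_V = jax.grad(V)
loss = lambda phi: jnp.sum(grad_V(phi)**2)

LR, STEPS = 1e-2, 2000

@jax.jit                      # just-in-time compile the whole descent loop
def descend(phi):
    def step(phi, _):
        return phi - LR * jax.grad(loss)(phi), None
    phi, _ = jax.lax.scan(step, phi, None, length=STEPS)
    return phi

# vmap parallelises the search over a batch of random initial moduli values.
key = jax.random.PRNGKey(0)
inits = jax.random.uniform(key, (128, 2), minval=-3.0, maxval=3.0)
candidates = jax.vmap(descend)(inits)
print("residual |dV| at the first few candidates:",
      jnp.sqrt(jax.vmap(loss)(candidates))[:5])
```

Minimising |dV|^2 rather than V itself targets all critical points of the potential, not only its minima.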
Gary Shiu (University of Wisconsin-Madison)
Special Colloquium
The Topology of Data: from String Theory to Cosmology to Phases of Matter
We are faced with an explosion of data in many areas of physics, but very often it is not the size but the complexity of the data that makes extracting physics from big datasets challenging. As I will discuss in this talk, data has shape, and the shape of data encodes the underlying physics. Persistent homology is a tool in computational topology developed for quantifying the shape of data. I will discuss three applications of topological data analysis (TDA): 1) identifying structure in the string landscape, 2) constraining cosmological parameters from CMB measurements and large-scale structure data, and 3) detecting and classifying phases of matter. Persistent homology condenses these datasets into their most relevant (and interpretable) features, so that simple statistical pipelines are sufficient in these contexts. This suggests that TDA can be used in conjunction with machine learning algorithms and can improve their architectures.
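As a small, self-contained illustration of persistent homology (not material from the talk), the sketch below computes the degree-0 persistence of a toy point cloud: connected components are born at scale 0 and die when clusters merge, which reduces to a minimum-spanning-tree computation. Higher-degree features (loops, voids) would require a library such as Ripser or GUDHI.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy data: two noisy clusters, so one long-lived finite H0 bar is expected.
points = np.vstack([rng.normal(0.0, 0.1, (30, 2)),
                    rng.normal(3.0, 0.1, (30, 2))])

n = len(points)
dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)

parent = list(range(n))
def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]   # path halving
        i = parent[i]
    return i

# Process edges of the Vietoris-Rips filtration in order of length (Kruskal).
edges = sorted((dist[i, j], i, j) for i in range(n) for j in range(i + 1, n))
deaths = []
for d, i, j in edges:
    ri, rj = find(i), find(j)
    if ri != rj:
        parent[ri] = rj
        deaths.append(d)   # a connected component dies at this scale

# Degree-0 persistence diagram: pairs (birth=0, death), plus one infinite bar.
bars = sorted(deaths, reverse=True)
print("longest finite H0 bars:", [round(x, 3) for x in bars[:3]])
```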