Foretell of Future AI from Mathematical Foundation
AAAI 2026 Workshop
January 26, 2026 | Singapore | Singapore EXPO
Recent breakthroughs in artificial intelligence trace their lineage to foundational mathematical discoveries spanning centuries. From a mathematical perspective, the evolution of artificial intelligence (AI) began with early foundations in mathematical logic and computation, which established the theoretical underpinnings of mechanized reasoning. The shift from symbolic AI to statistical methods in the 1980s marked a significant transition, leading to neural networks that rely on linear algebra and calculus for learning. Modern advances, particularly in deep learning, leverage sophisticated mathematical frameworks such as probability theory and manifold learning to handle high-dimensional data. The recent Kolmogorov-Arnold Networks model exemplifies this enduring synergy, applying approximation theory to create more interpretable architectures that better align with physical constraints. Moreover, the field of AI for science has seen the emergence of specialized architectures designed to incorporate physical laws and domain-specific knowledge, leading to breakthroughs in scientific machine learning applications. These developments highlight AI's enormous potential but also raise important questions about the theoretical foundations underlying these complex architectures, making it crucial to explore the mathematical foundations of deep learning architectures.
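The Kolmogorov-Arnold Networks mentioned above draw on the Kolmogorov-Arnold representation theorem, which expresses any continuous multivariate function through sums and compositions of univariate functions. The following is a minimal illustrative sketch of that idea, not the published KAN implementation: the class name, the radial-basis parameterization of the edge functions, and all shapes here are assumptions chosen for exposition. Each edge of the layer carries its own learnable univariate function, and the layer output sums the edge functions' values.

```python
import numpy as np

class KANEdgeLayer:
    """Toy Kolmogorov-Arnold-style layer: one learnable univariate
    function per (input, output) edge, each represented as a weighted
    sum of fixed Gaussian radial basis functions."""

    def __init__(self, in_dim, out_dim, n_basis=8, seed=0):
        rng = np.random.default_rng(seed)
        self.centers = np.linspace(-1.0, 1.0, n_basis)   # fixed RBF centers
        self.width = self.centers[1] - self.centers[0]   # shared bandwidth
        # one coefficient vector per edge: shape (out_dim, in_dim, n_basis)
        self.coef = rng.normal(scale=0.1, size=(out_dim, in_dim, n_basis))

    def forward(self, x):
        # x: (batch, in_dim) -> basis activations: (batch, in_dim, n_basis)
        phi = np.exp(-(((x[..., None] - self.centers) / self.width) ** 2))
        # evaluate every edge's univariate function, then sum over
        # input edges for each output unit -> (batch, out_dim)
        return np.einsum("bip,oip->bo", phi, self.coef)

layer = KANEdgeLayer(in_dim=3, out_dim=2)
y = layer.forward(np.zeros((4, 3)))
print(y.shape)  # (4, 2)
```

Unlike a standard dense layer, where nonlinearity sits on the nodes, the learnable nonlinearities here sit on the edges; this is the structural property that connects such architectures to approximation theory and to the interpretability claims discussed above.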
This workshop emphasizes the analysis of deep neural network architectures' expressiveness and optimization dynamics, as well as new architectures inspired by mathematical insights. We will explore how mathematical foundations can inform deep learning in optimization, training algorithms, and architecture design, and how mathematical tools such as approximation theorems can be used to evaluate new AI designs. The workshop will focus on the expressivity of new architectures, examining how their structural properties enhance approximation ability. We will also examine the optimization dynamics of models, investigating how mathematical theory improves training and convergence. Furthermore, the workshop will address the foundational theory of scientific machine learning models and seek to understand the reasoning behind their architectures. By bringing together experts in these areas, we aim to advance the development of new AI architectures grounded in mathematical insights, discuss mathematical theories that can guide model design, and inspire further theoretical analysis of AI.
The workshop will follow a one-day format featuring a mix of invited talks from leading experts, contributed presentations from submitted papers, poster sessions, and panel discussions. This format is designed to maximize interaction between participants and foster meaningful dialogue that can lead to new collaborations and research directions. The two poster sessions and the panel discussions will encourage attendees to discuss the mathematical foundations that can inspire new types of deep learning algorithms and architectures.