Confirmed Speakers
Automated theorem proving is a subfield of automated reasoning and mathematical logic dealing with proving mathematical theorems by computer programs. Automated theorem-proving software, e.g. ProverX, has demonstrated its potential by producing theorems published in leading mathematics journals and by solving a substantial number of open problems.
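To make the idea concrete, here is a minimal, generic illustration (not output of ProverX, and assuming a recent Lean 4 toolchain in which the omega decision procedure is available) of a statement being discharged by an automated tactic rather than by a hand-written proof:

    -- a universally quantified arithmetic statement closed automatically
    example (a b : Nat) : a + b = b + a := by
      omega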
Exploring mathematical structure in language may offer valuable insights into questions surrounding large language models. In this talk, we propose that enriched category theory can provide a natural framework for investigating the passage from texts, and probability distributions on them, to a more semantically meaningful space. We will define a category of expressions in language enriched over the unit interval and then pass to a certain class of functions on those expressions. As we will see, the latter setting has rich mathematical structure and comes with category-theoretic and geometric tools with which to explore that structure.
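As a rough sketch of the kind of enrichment involved (the precise definitions used in the talk may differ), one can enrich over the monoidal poset $([0,1],\times,1)$ and take the hom-object between two expressions to be the probability that one extends the other:

\[
\mathcal{L}(x,y) \;=\; \pi(y \mid x) \in [0,1],
\]

with enriched composition expressed by the inequality

\[
\mathcal{L}(y,z)\,\mathcal{L}(x,y) \;\le\; \mathcal{L}(x,z),
\qquad\text{i.e.}\qquad
\pi(z \mid y)\,\pi(y \mid x) \;\le\; \pi(z \mid x).
\]

The "certain class of functions on expressions" can then be thought of as $[0,1]$-enriched (co)presheaves on $\mathcal{L}$, which is where the additional categorical and geometric structure lives.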
Symbol Spatialisation provides a geometric method to unify, without any loss, symbolic AI with neural networks, and to shape novel AI systems that inherit the advantages of both classical AI and neural networks.
Quantum Machine Learning is the integration of quantum algorithms within machine learning programs. The term generally refers to machine learning algorithms for the analysis of classical data executed on a quantum computer, i.e. quantum-enhanced machine learning.
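As a concrete, toy-sized illustration of quantum-enhanced learning, the sketch below trains a one-parameter variational circuit on classical data; PennyLane and its default simulator are assumed choices standing in for actual quantum hardware:

    import pennylane as qml
    from pennylane import numpy as np  # autograd-aware NumPy shipped with PennyLane

    dev = qml.device("default.qubit", wires=1)  # classical simulator backend

    @qml.qnode(dev)
    def circuit(theta, x):
        qml.RY(x, wires=0)                # encode the classical feature as a rotation angle
        qml.RY(theta, wires=0)            # trainable rotation
        return qml.expval(qml.PauliZ(0))  # expectation value serves as the model output

    def cost(theta, xs, ys):
        # mean squared error against labels in {-1, +1}
        return sum((circuit(theta, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

    xs = np.array([0.2, 0.4, 2.6, 2.9])    # toy 1-d inputs
    ys = np.array([1.0, 1.0, -1.0, -1.0])  # toy labels
    theta = np.array(0.1, requires_grad=True)

    opt = qml.GradientDescentOptimizer(stepsize=0.3)
    for _ in range(50):
        theta = opt.step(lambda t: cost(t, xs, ys), theta)

    print(theta, [float(circuit(theta, x)) for x in xs])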
Algebraic Machine Learning is a new artificial intelligence paradigm that combines user-defined symbols with self-generated symbols, allowing a system to learn from data and adapt to the world as neural networks do, while retaining the explainability of Symbolic AI.
AI-driven knowledge discovery: using machines, it is possible to generate nontrivial algebraic conjectures in mathematics, which often turn out to be related to existing long-standing conjectures.
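The generate-and-test mechanism behind such discovery can be illustrated with a deliberately tiny sketch (a hypothetical toy, not the systems discussed in the talk): enumerate candidate identities over a small term grammar and keep those that survive exhaustive testing in several finite rings Z_n. At this depth only trivial identities survive; deeper grammars and richer structures are where genuine conjectures appear.

    import itertools

    # tiny term grammar over two variables with + and *
    def terms(depth):
        if depth == 0:
            return ["a", "b"]
        smaller = terms(depth - 1)
        out = list(smaller)
        for s, t in itertools.product(smaller, repeat=2):
            out.append(f"({s}+{t})")
            out.append(f"({s}*{t})")
        return out

    def holds(lhs, rhs, n):
        # does the identity lhs == rhs hold for all a, b in Z_n?
        return all(eval(lhs, {}, {"a": a, "b": b}) % n == eval(rhs, {}, {"a": a, "b": b}) % n
                   for a in range(n) for b in range(n))

    candidates = terms(1)
    conjectures = [f"{lhs} = {rhs}"
                   for lhs, rhs in itertools.combinations(candidates, 2)
                   if all(holds(lhs, rhs, n) for n in (5, 7, 11))]

    print(conjectures)  # at this tiny depth: the two commutativity identities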
Zeta Neural Networks aim to overcome shortcomings of wide multi-layer perceptrons (MLPs). Namely, as infinitely wide MLPs are non-parametric, they do not have a well-defined pointwise limit, which is tied to the loss of non-Gaussian behaviour and to the inability to perform feature learning. Zeta Neural Networks address this through ideas from harmonic analysis, breaking the (MLP permutation) symmetry at the level of the individual perceptrons.
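The Gaussian behaviour alluded to above can be seen numerically. The sketch below (an illustration of the standard wide-network phenomenon, not of Zeta Neural Networks themselves) samples the output of a randomly initialised one-hidden-layer MLP at a fixed input and watches the excess kurtosis of that output distribution shrink towards zero, the Gaussian value, as the width grows:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=5)  # a fixed input

    def output_sample(width):
        # one hidden tanh layer with i.i.d. Gaussian weights and 1/sqrt(fan_in) scaling
        W1 = rng.normal(size=(width, x.size)) / np.sqrt(x.size)
        w2 = rng.normal(size=width) / np.sqrt(width)
        return w2 @ np.tanh(W1 @ x)

    for width in (2, 8, 64, 512):
        samples = np.array([output_sample(width) for _ in range(20000)])
        # excess kurtosis tends to 0 as the output distribution becomes Gaussian
        kurt = np.mean((samples - samples.mean()) ** 4) / samples.var() ** 2 - 3.0
        print(width, round(kurt, 3))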
Feature decoupling refers to the process, in Data Analysis, of minimizing the entanglement, both statistical and algebraic, of the features in a given dataset. Redundancy among features has a negative impact on many applications, including Machine Learning, which makes decoupling a relevant aspect of data processing.
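On the statistical side, a standard linear baseline is whitening. The sketch below (an illustration, not the method of the talk) applies ZCA whitening to a toy dataset with strongly correlated features and checks that the resulting covariance is close to the identity, i.e. that the linear coupling has been removed:

    import numpy as np

    rng = np.random.default_rng(0)
    # toy dataset whose two features are strongly correlated (entangled)
    X = rng.normal(size=(1000, 2)) @ np.array([[1.0, 0.9], [0.0, 0.4]])

    # ZCA whitening: rotate, rescale by eigenvalues, rotate back
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    W = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T
    X_white = Xc @ W

    print(np.round(np.cov(X_white, rowvar=False), 3))  # ~ identity: features decorrelated

Note that this only removes linear statistical dependence; algebraic entanglement between features is a separate question.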
Automated reasoning, sometimes referred to as "strong artificial intelligence", involves automated deductive reasoning (automated theorem proving) as well as inductive reasoning (machine learning and discovery), and combinations of the two.
Stephen Wolfram is a renowned computer scientist, mathematician, and theoretical physicist best known for his pioneering work in computational mathematics and cellular automata. He founded Wolfram Research in 1987, where he created the well-known software Mathematica and Wolfram Alpha, a computational knowledge engine. Wolfram is currently developing a new foundation for physics and exploring the potential of artificial intelligence in science. He emphasizes the importance of computational thinking in understanding complex systems and proposes integrating AI with specialized computational tools to overcome the limitations of current AI models in tackling complex scientific challenges.