Bandit Learning in Matching Markets
(IJCAI 2025)
Montreal, Canada.
Slides
Presenters
Shuai Li (Shanghai Jiao Tong University)
Zilong Wang (Shanghai Jiao Tong University)
Contact email: shuaili8@sjtu.edu.cn
Description
Matching markets are fundamental in economics and game theory, with the goal of producing matching outcomes that achieve a desired objective. Stable matching is a central problem in this field, characterizing the equilibrium state among agents. Since agents usually have uncertain preferences, bandit learning for this problem has recently attracted substantial research attention. This line of work mainly focuses on two metrics: the algorithms' stable regret, which characterizes the utility of agents, and incentive compatibility, which characterizes the robustness of the system. This tutorial comprehensively introduces the latest advancements and pioneering results on this problem, with particular emphasis on these two metrics. To explore the application of bandit algorithms in other areas of mechanism design, the tutorial also broadens its scope to the bandit learning problem in auctions.
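To make the setting concrete, below is a minimal sketch (not drawn from the tutorial materials) of one common centralized approach: each player maintains UCB estimates of its unknown preferences over arms, submits the induced ranking to a platform, and the platform computes a player-proposing deferred-acceptance (Gale–Shapley) matching using the arms' known preferences. All names, parameter values, and the reward model are illustrative assumptions, not a definitive implementation of any specific algorithm covered in the tutorial.

```python
import numpy as np

def gale_shapley(player_prefs, arm_prefs):
    """Player-proposing deferred acceptance.
    player_prefs[i]: list of arms, most preferred first.
    arm_prefs[j]: dict mapping player index -> rank (lower is better)."""
    n_players = len(player_prefs)
    next_prop = [0] * n_players   # next arm each free player will propose to
    match_of_arm = {}             # arm -> currently held player
    free = list(range(n_players))
    while free:
        i = free.pop()
        j = player_prefs[i][next_prop[i]]
        next_prop[i] += 1
        if j not in match_of_arm:
            match_of_arm[j] = i
        elif arm_prefs[j][i] < arm_prefs[j][match_of_arm[j]]:
            free.append(match_of_arm[j])   # arm j prefers i; old partner is freed
            match_of_arm[j] = i
        else:
            free.append(i)                 # proposal rejected
    return {i: j for j, i in match_of_arm.items()}  # player -> arm

# Illustrative market: preferences of players over arms are unknown (bandit feedback),
# preferences of arms over players are known to the platform.
rng = np.random.default_rng(0)
n_players, n_arms, horizon = 3, 3, 2000
true_means = rng.uniform(0.2, 0.9, size=(n_players, n_arms))   # hidden from players
arm_prefs = [{int(i): rank for rank, i in enumerate(rng.permutation(n_players))}
             for _ in range(n_arms)]

counts = np.zeros((n_players, n_arms))
sums = np.zeros((n_players, n_arms))

for t in range(1, horizon + 1):
    means = np.divide(sums, counts, out=np.zeros_like(sums), where=counts > 0)
    bonus = np.sqrt(2 * np.log(t) / np.maximum(counts, 1))
    ucb = np.where(counts > 0, means + bonus, np.inf)  # unexplored arms ranked first
    # Each player ranks arms by UCB index; the platform returns a stable matching.
    player_prefs = [list(np.argsort(-ucb[i])) for i in range(n_players)]
    matching = gale_shapley(player_prefs, arm_prefs)
    for i, j in matching.items():
        reward = true_means[i, j] + 0.1 * rng.standard_normal()  # noisy feedback
        counts[i, j] += 1
        sums[i, j] += reward

print("empirical means:\n", np.round(sums / np.maximum(counts, 1), 2))
print("final matching (player -> arm):", matching)
```

Under this sketch, the matching computed from the UCB rankings gradually aligns with the stable matching under the true preferences, which is the kind of behavior that stable-regret analyses in this line of work quantify.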
Schedule
Two-sided Matching Markets
Multi-Armed Bandits
Bandit Learning in Matching Markets
Beyond Matching Markets
The target audience for this tutorial includes researchers working in, or with a keen interest in, online learning and mechanism design. For those in online learning, the tutorial elucidates the nuances of algorithm design in multi-player game interactions, which is useful for reasoning about player cooperation and competition in online learning environments. For those in mechanism design, the tutorial offers valuable perspectives on how to design effective mechanisms when agents' preferences are unknown.
Prior familiarity with basic concepts in mechanism design and multi-armed bandits (MAB) would be beneficial for fully grasping the content presented in this tutorial.