Program and Abstract

PART 1: 10:00-11:00

Shi Jin (Plenary Speaker)

Shanghai Jiao Tong University


Quantum Computation of Partial Differential Equations and Linear Algebra Problems

Quantum computers have the potential to achieve algebraic, and even up to exponential, speedup over their classical counterparts, and could lead to a technology revolution in the 21st century. Since quantum computers are designed on the principles of quantum mechanics, they are most suitable for solving the Schrödinger equation, and linear PDEs (and ODEs) evolved by unitary operators. The most efficient quantum PDE solver is quantum simulation based on solving the Schrödinger equation.

It is therefore interesting to explore what other problems in scientific computing, such as ODEs, PDEs, linear algebra, and optimization problems, can be handled by quantum simulation. This becomes challenging for general PDEs, and more so for nonlinear ones. We will first give a short “mathematician's survival kit” on quantum computing, then discuss three topics:

1) We introduce the “warped phase transformation” to map general linear PDEs and ODEs to the Schrödinger equation, or to systems with unitary evolution operators, in one higher dimension, so that they become suitable for quantum simulation. This method also allows us to perform quantum simulation of iterative methods in linear algebra.
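As a rough sketch of the underlying idea (my reconstruction of the warped phase transformation, sometimes called Schrödingerisation; signs, conventions, and details may differ from the speaker's formulation), consider a linear system, e.g. a semi-discretized PDE:

```latex
\[
\frac{du}{dt} = A u, \qquad
A = H_1 + i H_2, \quad
H_1 = \tfrac{1}{2}\bigl(A + A^\dagger\bigr), \;
H_2 = \tfrac{1}{2i}\bigl(A - A^\dagger\bigr),
\]
so that $H_1$ and $H_2$ are Hermitian. Introducing an auxiliary variable $p > 0$
and the warped function $v(t,p) = e^{-p} u(t)$, one finds (using $\partial_p v = -v$)
the linear transport-type equation
\[
\partial_t v = -H_1 \, \partial_p v + i H_2 \, v .
\]
A Fourier transform in $p$, with dual variable $\xi$, turns this into
\[
i \, \partial_t \hat v(t,\xi) = \bigl(\xi H_1 - H_2\bigr) \, \hat v(t,\xi),
\]
a Schrödinger-type equation with Hermitian Hamiltonian $\xi H_1 - H_2$ for each $\xi$,
hence unitary evolution; the original solution is recovered as
$u(t) = e^{p} v(t,p)$ for any $p > 0$.
```

The point is that a general (non-unitary) linear evolution is embedded, at the cost of one extra dimension, into a family of Hamiltonian systems that a quantum computer can simulate natively.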

2) For (nonlinear) Hamilton-Jacobi equations and scalar nonlinear hyperbolic equations, we use the level set method to map them exactly to linear PDEs in phase space, so that they can be implemented with quantum algorithms, and we gain quantum advantages with respect to various physical and numerical parameters.
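To illustrate the level set idea (a sketch in the spirit of the phase-space level set formulation; the exact setting in the talk may differ), take a Hamilton-Jacobi equation:

```latex
\[
\partial_t S + H\bigl(x, \nabla_x S\bigr) = 0 .
\]
One introduces a level set function $\phi(t,x,p)$ in phase space whose zero level
set encodes the gradient, $p = \nabla_x S(t,x)$. Since $\phi$ is constant along
the bicharacteristics $\dot{x} = \nabla_p H$, $\dot{p} = -\nabla_x H$, it solves
the \emph{linear} Liouville equation
\[
\partial_t \phi + \nabla_p H \cdot \nabla_x \phi - \nabla_x H \cdot \nabla_p \phi = 0 .
\]
```

The nonlinearity is thus traded for extra (phase-space) dimensions and an exactly linear evolution, which is the form amenable to quantum algorithms.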

3) For PDEs with uncertain coefficients, we introduce a transformation so that the uncertainty appears only in the initial data, allowing us to compute ensemble averages over multiple initial data in just one run, instead of the multiple runs required by Monte Carlo or stochastic collocation type sampling algorithms.

PART 2: 11:10-12:00

YoungJoon Hong

KAIST


Toward a bridge between machine learning and applied mathematics

This lecture explores the topics and areas that have guided my research in computational mathematics and deep learning in recent years. Numerical methods in computational science are essential for understanding real-world phenomena, and deep neural networks have achieved state-of-the-art results in a range of fields. The rapid expansion and outstanding success of deep learning and scientific computing have led to applications across multiple disciplines. In this lecture, I will focus on connecting machine learning with applied mathematics, discussing topics such as adversarial examples, generative models, and scientific machine learning.

PART 3: 13:30-14:20

Jinwoo Shin

KAIST


Deep Learning for Videos: Representation and Generation

In the last ten years, there has been remarkable progress in deep learning models for image-related tasks, including classification, generation, and representation. Progress on video-related tasks, however, has arguably been slower, despite their practical importance. In this talk, I will present my recent work on deep learning methods for better modeling of videos.

PART 4: 14:30-15:20

Sangdoo Yun

NAVER CLOUD AI Lab (Research Director)


Towards Strong and Robust Vision Models: Insights from Data and Supervision

This talk delves into the frontier of developing robust and strong vision models. It aims to discuss and understand the challenges of building robust vision models, particularly from the data and supervision perspectives. These challenges include data scarcity, dataset bias, annotation noise, and imperfect supervision. Based on insights from these perspectives, we introduce our solutions toward robust and strong vision models. Our simple yet powerful methods come with negligible additional cost, further emphasizing their efficiency. This talk will benefit those interested in pushing the boundaries of vision models' performance.

PART 5: 16:00-17:00

Alessio Figalli (Plenary Speaker, ZOOM)

ETH Zurich


Optimal transport: math and beyond

The concept of optimal transport, initially conceived by Gaspard Monge in the late 18th century to find the most efficient way of transporting a distribution of material from one place to another to build fortifications, has evolved into a powerful framework. Over the past three decades, optimal transport theory has witnessed a wide range of applications, spanning from natural phenomena to machine learning. In this presentation, I will highlight the significant applications that have shaped my own research journey and delve into recent advancements that have further expanded the scope of this theory.

PART 6: 17:10-18:10

Young-Heon Kim (Plenary Speaker)

UBC


A few directions in optimal transport

Optimal transport considers efficient matchings between distributions, and it has applications broadly in mathematical analysis, probability, economics, physics, and data science, among others. We will describe a few research directions where the ideas of optimal transport are applied, including those in stochastic optimization problems, free boundary partial differential equations, as well as biological data analysis.