Seminar on Control Theory and Nonlinear Filtering
Variational inference via Wasserstein gradient flows
Organizer
Stephen S-T. Yau
Speaker
Jiayi Kang
Time
Tuesday, December 20, 2022 9:00 PM - 9:30 PM
Venue
Online
Abstract
Along with Markov chain Monte Carlo (MCMC) methods, variational inference (VI) has emerged as a central computational approach to large-scale Bayesian inference. Rather than sampling from the true posterior π, VI aims to produce a simple but effective approximation π̂ to π for which summary statistics are easy to compute. However, unlike the well-studied MCMC methodology, algorithmic guarantees for VI remain relatively poorly understood. In this talk, I will introduce a paper whose authors propose principled methods for VI in which π̂ is taken to be a Gaussian or a mixture of Gaussians; these methods rest upon the theory of gradient flows on the Bures–Wasserstein space of Gaussian measures. Akin to MCMC, they come with strong theoretical guarantees when π is log-concave.
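To make the idea concrete, the following is a minimal sketch (not the paper's reference implementation) of gradient descent on the Bures–Wasserstein space for a Gaussian approximation π̂ = N(m, Σ). The function name, step size, and the choice of a Gaussian target π ∝ exp(−V) with quadratic V (so the expectations E[∇V] and E[∇²V] are closed-form) are all illustrative assumptions; for a general log-concave π those expectations would be estimated, e.g., by sampling.

```python
import numpy as np

def bw_gradient_descent(mu, Lam, m0, Sigma0, step=0.1, iters=200):
    """Illustrative Bures-Wasserstein gradient descent for Gaussian VI.

    Target: pi ∝ exp(-V) with V(x) = 0.5 (x - mu)^T Lam (x - mu),
    chosen so that E_q[grad V] and E_q[Hess V] are available in closed
    form under the Gaussian iterate q = N(m, Sigma).
    """
    d = len(mu)
    m, Sigma = m0.astype(float).copy(), Sigma0.astype(float).copy()
    I = np.eye(d)
    for _ in range(iters):
        grad_mean = Lam @ (m - mu)       # E_q[grad V(X)] for quadratic V
        M = Lam - np.linalg.inv(Sigma)   # E_q[Hess V(X)] - Sigma^{-1}
        m = m - step * grad_mean         # mean moves along -E_q[grad V]
        A = I - step * M
        Sigma = A @ Sigma @ A            # covariance update on BW space
    return m, Sigma

# Example: for this Gaussian target the iterates converge to the
# exact posterior, mean mu and covariance Lam^{-1}.
mu = np.array([1.0, -2.0])
Lam = np.array([[2.0, 0.5], [0.5, 1.0]])
m, Sigma = bw_gradient_descent(mu, Lam, np.zeros(2), np.eye(2))
```

The covariance step has the form (I − ηM) Σ (I − ηM), which keeps Σ symmetric positive definite for small step sizes; this symmetric-sandwich update is the natural discretization of the gradient flow on the manifold of Gaussian covariances.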
Speaker Intro
Jiayi Kang received his Ph.D. in Mathematics from Tsinghua University in 2024. He joined the Beijing Institute of Mathematical Sciences and Applications (BIMSA) as an Assistant Researcher in July 2024, and became an Assistant Professor at the Hetao Institute for Mathematical and Interdisciplinary Sciences (HIMIS) in November 2025.
His research focuses on the intersection of deep learning, nonlinear filtering, and computational biology. His main research interests include: neural network-based filtering algorithms and their mathematical foundations, sampling methods in Wasserstein geometry, nonlinear filtering theory (including the Yau-Yau method) and its applications in climate science and other fields, as well as computational genomics and evolutionary system modeling. He is committed to solving complex problems in science and engineering using mathematical and machine learning methods.