Variational inference via Wasserstein gradient flows
Organizer
Stephen Shing-Toung Yau
Speaker
Jiayi Kang
Time
December 20, 2022, 21:00–21:30
Venue
Online
Abstract
Along with Markov chain Monte Carlo (MCMC) methods, variational inference (VI) has emerged as a central computational approach to large-scale Bayesian inference. Rather than sampling from the true posterior π, VI aims to produce a simple but effective approximation π̂ to π for which summary statistics are easy to compute. However, unlike the well-studied MCMC methodology, algorithmic guarantees for VI remain relatively poorly understood. In this talk, I will introduce a paper in which the authors propose principled methods for VI, where π̂ is taken to be a Gaussian or a mixture of Gaussians, resting upon the theory of gradient flows on the Bures–Wasserstein space of Gaussian measures. Akin to MCMC, these methods come with strong theoretical guarantees when π is log-concave.
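To make the idea concrete, the following is a minimal sketch (not the paper's exact algorithm) of one commonly used discretization of the Bures–Wasserstein gradient flow of the KL divergence over a Gaussian variational family, in one dimension. The target π is itself chosen Gaussian here so that the expectations of ∇V and ∇²V (where V = −log π) are available in closed form; in practice these would be estimated by sampling. All variable names and the step size are illustrative assumptions.

```python
import math

# Assumed log-concave target: pi = N(mu_star, sig_star^2),
# so V(x) = -log pi(x) = (x - mu_star)^2 / (2 sig_star^2) + const.
mu_star, sig_star = 2.0, 0.5

def bw_gradient_step(m, s2, h=0.05):
    """One Bures-Wasserstein gradient-descent step on KL(N(m, s2) || pi).

    For this Gaussian target the required expectations are exact:
      E[V'(X)]  = (m - mu_star) / sig_star^2   (X ~ N(m, s2))
      E[V''(X)] = 1 / sig_star^2
    """
    grad_mean = (m - mu_star) / sig_star**2      # BW gradient in the mean
    M = 1.0 / sig_star**2 - 1.0 / s2             # E[V''] - Sigma^{-1}
    m_new = m - h * grad_mean
    s2_new = (1.0 - h * M) ** 2 * s2             # multiplicative update keeps s2 > 0
    return m_new, s2_new

m, s2 = 0.0, 1.0                                 # initial variational Gaussian N(0, 1)
for _ in range(2000):
    m, s2 = bw_gradient_step(m, s2)

print(m, math.sqrt(s2))  # approaches (mu_star, sig_star) = (2.0, 0.5)
```

The covariance update is written in the factored form (1 − hM)² s2 rather than an additive Euler step, which keeps the iterate a valid (positive) variance; for log-concave targets such schemes converge at an explicit rate, which is the kind of guarantee the abstract refers to.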
Speaker Introduction
Jiayi Kang received his Ph.D. in Mathematics from Tsinghua University in 2024. He joined the Beijing Institute of Mathematical Sciences and Applications (BIMSA) as an Assistant Researcher in July 2024, and became an Assistant Professor at the Hetao Institute for Mathematical and Interdisciplinary Sciences (HIMIS) in November 2025.
His research focuses on the intersection of deep learning, nonlinear filtering, and computational biology. His main research interests include: neural network-based filtering algorithms and their mathematical foundations, sampling methods in Wasserstein geometry, nonlinear filtering theory (including the Yau-Yau method) and its applications in climate science and other fields, as well as computational genomics and evolutionary system modeling. He is committed to solving complex problems in science and engineering using mathematical and machine learning methods.