Control Theory and Nonlinear Filtering Seminar
Accelerated Linearized Laplace Approximation for Bayesian Deep Learning
Organizer
Stephen Shing-Toung Yau
Speaker
Jiayi Kang
Time
January 3, 2023, 21:00–21:30
Venue
Online
Abstract
I shall report on a work on Laplace approximation. The Laplace approximation (LA) and its linearized variant (LLA) enable effortless adaptation of pretrained deep neural networks into Bayesian neural networks. The generalized Gauss-Newton (GGN) approximation is typically introduced to improve their tractability. However, LA and LLA still face non-trivial inefficiency issues and must rely on Kronecker-factored, diagonal, or even last-layer approximate GGN matrices in practical use. These approximations are likely to harm the fidelity of learning outcomes. To tackle this issue, inspired by the connections between LLA and neural tangent kernels (NTKs), we develop a Nyström approximation to NTKs to accelerate LLA.
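The abstract mentions a Nyström approximation to the NTK but gives no implementation details. As a minimal sketch of the generic Nyström idea it builds on, the snippet below approximates a dense kernel matrix from a subset of landmark columns; the RBF kernel, landmark count `m`, and all parameter names here are illustrative, not taken from the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    # Pairwise squared Euclidean distances via broadcasting,
    # then the Gaussian (RBF) kernel exp(-gamma * ||a - b||^2).
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def nystrom_approx(X, m, gamma=0.1, seed=0):
    # Nystrom: pick m landmark points Z, then approximate
    # K(X, X) ~= K(X, Z) K(Z, Z)^+ K(Z, X).
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    Z = X[idx]
    K_nm = rbf_kernel(X, Z, gamma)   # n x m cross-kernel
    K_mm = rbf_kernel(Z, Z, gamma)   # m x m landmark kernel
    # Pseudo-inverse for numerical stability when K_mm is ill-conditioned.
    return K_nm @ np.linalg.pinv(K_mm) @ K_nm.T

X = np.random.default_rng(1).normal(size=(200, 5))
K_full = rbf_kernel(X, X)
K_approx = nystrom_approx(X, m=50)
err = np.linalg.norm(K_full - K_approx) / np.linalg.norm(K_full)
```

The payoff is cost: forming and inverting the approximation touches only an n-by-m block and an m-by-m block instead of the full n-by-n kernel, which is what makes a Nyström-style scheme attractive for accelerating kernel-based posterior approximations such as LLA.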
Speaker Introduction
Jiayi Kang received his Ph.D. in Mathematics from Tsinghua University in 2024. He joined the Beijing Institute of Mathematical Sciences and Applications (BIMSA) as an Assistant Researcher in July 2024, and became an Assistant Professor at the Hetao Institute for Mathematical and Interdisciplinary Sciences (HIMIS) in November 2025.
His research focuses on the intersection of deep learning, nonlinear filtering, and computational biology. His main interests include neural-network-based filtering algorithms and their mathematical foundations, sampling methods in Wasserstein geometry, nonlinear filtering theory (including the Yau-Yau method) and its applications in climate science and other fields, as well as computational genomics and the modeling of evolutionary systems. He is committed to solving complex problems in science and engineering with mathematical and machine-learning methods.