Generalized Variational Optimal Estimator
Organizer
Speaker
刘士琪
Time
April 10, 2024, 15:00–15:30
Location
Online
Abstract
Bayesian filtering is the mainstream framework for state estimation in dynamic systems from noisy observations. In practice, however, observations are often contaminated by outliers, which sharply degrades estimation accuracy. In this research, we analyze Bayesian inference from an optimization perspective, known as the generalized Bayesian inference framework. Within this framework, we investigate the robustness of various divergences and find that the Kullback-Leibler divergence (KLD) is sensitive to outliers, rendering standard Bayesian inference non-robust. We develop a robust variational filtering method, the Generalized Variational Optimal Estimator (GVOE), based on the β-divergence, and propose a self-supervised training method that approximates the underlying optimization problem with neural networks. To validate the robustness of GVOE, we conduct simulations on classical atmospheric dynamics models, namely Lorenz96 and Vissio-Lucarini 20, achieving improvements of 42.8% and 98.8% in accuracy and speed, respectively.
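The contrast between the KLD and the β-divergence can be illustrated outside the filtering setting with a minimal sketch. The code below is not the GVOE method from the talk; it is a hypothetical scalar example, assuming a Gaussian location model with unit variance, where the KLD minimizer coincides with the sample mean while the β-divergence (density-power divergence) minimizer reduces to an iteratively reweighted mean that exponentially down-weights points far from the current estimate:

```python
import numpy as np

def beta_mean(x, beta=0.5, iters=50):
    """Location estimate minimizing the beta (density-power) divergence
    to a N(mu, 1) model.

    The estimating equation reduces to a weighted mean with weights
    w_i = exp(-beta * (x_i - mu)^2 / 2), so gross outliers receive
    exponentially small weight. (Illustrative sketch, not GVOE.)
    """
    mu = np.median(x)  # robust starting point
    for _ in range(iters):
        w = np.exp(-beta * (x - mu) ** 2 / 2.0)
        mu = np.sum(w * x) / np.sum(w)
    return mu

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=950)     # inliers around the true mean 0
outliers = rng.normal(20.0, 1.0, size=50)  # 5% gross outliers
x = np.concatenate([clean, outliers])

kl_est = x.mean()        # KLD minimizer (MLE): pulled toward the outliers
beta_est = beta_mean(x)  # beta-divergence minimizer: stays near 0
```

With 5% of the mass at 20, the sample mean is biased by roughly 1.0, while the β-divergence estimate remains close to the true mean of 0; this down-weighting of low-likelihood observations is the mechanism that makes β-divergence-based inference robust.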