Beijing Institute of Mathematical Sciences and Applications

Seminar on Control Theory and Nonlinear Filtering
Nonlinear Bayesian Filtering with Natural Gradient Gaussian Approximation
Organizer
Shing-Tung Yau
Speaker
Wenhan Cao
Time
Wednesday, October 23, 2024 8:00 PM - 8:30 PM
Venue
Online
Abstract
Practical Bayes filters often assume the state distribution at each time step to be Gaussian for computational tractability, resulting in the so-called Gaussian filters. In nonlinear systems, Gaussian filters such as the extended Kalman filter (EKF) or the unscented Kalman filter (UKF) typically rely on certain linearization techniques, which can introduce large estimation errors. In this paper, we reconstruct the prediction and update steps of Gaussian filtering as solutions to two distinct optimization problems, whose optimality conditions are found to have analytical forms via Stein's lemma. It is observed that the stationary point for the prediction step requires calculating the first two moments of the prior distribution, which is equivalent to that step in existing moment-matching filters. In the update step, instead of linearizing the model to approximate the stationary points, we propose an iterative approach that directly minimizes the update step's objective, thereby avoiding linearization errors. To perform steepest descent on the Gaussian manifold, we use natural gradient descent, which leverages the Fisher information matrix to adjust the gradient direction and account for the curvature of the parameter space. Combining this update step with moment matching in the prediction step, we introduce a new filter called the Natural Gradient Gaussian Approximation filter, or NANO filter for short. We prove that the NANO filter converges to the optimal Gaussian approximation at each time step, with errors bounded up to a second-order Taylor expansion. The estimation error is proven to be exponentially bounded for nearly linear measurement equations and low noise levels by constructing a supermartingale-like inequality across consecutive time steps. Real-world experiments demonstrate that, compared to popular Gaussian filters such as the EKF, UKF, iterated EKF, and posterior linearization filter, the NANO filter reduces the average root mean square error by approximately 45% while maintaining a comparable computational burden.
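
The following is a minimal, illustrative sketch (not the authors' implementation) of one filtering cycle in this spirit. It assumes a scalar state, linear dynamics, a known nonlinear measurement function, a Gauss-Newton approximation of the log-joint Hessian, and Monte Carlo estimates of the expectations; the function names, learning rate, and sample count are hypothetical choices made only for the example. The prediction step is plain moment matching, and the update step iterates natural-gradient steps on the Gaussian mean and precision.

import numpy as np

# Illustrative 1-D sketch of a natural-gradient Gaussian update step
# (not the paper's code; h, dh, lr, iters, and n_mc are assumptions).
# q(x) = N(m, P) is updated with the standard Gaussian natural-gradient
# form: the precision moves toward the expected curvature of the
# negative log joint, and the mean moves along the covariance-
# preconditioned expected gradient.

def h(x):                      # assumed nonlinear measurement function
    return np.sin(x)

def dh(x):                     # its derivative
    return np.cos(x)

def nano_like_update(m_pred, P_pred, y, R, lr=0.5, iters=20, n_mc=256, seed=0):
    rng = np.random.default_rng(seed)
    m, P = m_pred, P_pred
    for _ in range(iters):
        xs = m + np.sqrt(P) * rng.standard_normal(n_mc)      # samples from q
        # gradient and Gauss-Newton Hessian of -log p(x, y) w.r.t. x
        g = (xs - m_pred) / P_pred - dh(xs) * (y - h(xs)) / R
        H = 1.0 / P_pred + dh(xs) ** 2 / R
        g_bar, H_bar = g.mean(), H.mean()
        # natural-gradient step in (mean, precision) coordinates
        prec_new = (1 - lr) / P + lr * H_bar
        P = 1.0 / prec_new
        m = m - lr * P * g_bar
    return m, P

# one filtering cycle: moment-matching prediction, then iterative update
m, P = 0.0, 1.0                # previous posterior N(m, P)
a, Q, R = 0.9, 0.1, 0.05       # dynamics x' = a*x + w, process/measurement noise
m_pred, P_pred = a * m, a**2 * P + Q
y = 0.4                        # observed measurement
m_post, P_post = nano_like_update(m_pred, P_pred, y, R)
print(m_post, P_post)

In this form the iteration is a steepest-descent method on the Gaussian manifold rather than on the raw (mean, covariance) coordinates, which is the role played by the Fisher information matrix described in the abstract.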
