Seminar on Control Theory and Nonlinear Filtering
Stochastic Approximation Beyond Gradient for Signal Processing and Machine Learning
Organizer
Stephen S-T. Yau
Speaker
Yangtianze Tao
Time
Monday, August 7, 2023 3:00 PM - 3:30 PM
Venue
Science Building A-203, Department of Mathematical Sciences
Abstract
Stochastic Approximation (SA) is a classical algorithm that has, since its early days, had a profound impact on signal processing and, more recently, on machine learning, driven by the need to handle large amounts of data observed under uncertainty. A prominent special case of SA is the popular stochastic (sub)gradient algorithm, the workhorse behind many important applications. A lesser-known fact is that the SA scheme also covers non-stochastic-gradient algorithms such as compressed stochastic gradient, stochastic expectation-maximization, and a number of reinforcement learning algorithms. The aim of this presentation is to introduce this non-stochastic-gradient perspective on SA to the signal processing and machine learning audiences by presenting a design guideline for SA algorithms backed by theory. Our central theme is a general framework that unifies existing theories of SA, including its non-asymptotic and asymptotic convergence results, and demonstrates their application to popular non-stochastic-gradient algorithms.
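As a rough illustration of the SA scheme mentioned in the abstract (not material from the talk itself), the Python sketch below implements the generic Robbins-Monro recursion theta_{k+1} = theta_k + gamma_{k+1} * H(theta_k, X_{k+1}) and instantiates it twice: once as plain stochastic gradient descent, and once as a root-finding iteration that involves no explicit loss gradient. All names (stochastic_approximation, sgd_drift, quantile_drift), constants, and step-size choices are illustrative assumptions.

```python
import numpy as np

def stochastic_approximation(theta0, drift, num_iters, gamma0):
    """Generic Robbins-Monro recursion:
        theta_{k+1} = theta_k + gamma_{k+1} * H(theta_k, X_{k+1}),
    where drift(theta) returns one noisy evaluation H(theta, X).
    """
    theta = np.asarray(theta0, dtype=float)
    for k in range(num_iters):
        gamma = gamma0 / (k + 1)  # diminishing steps: sum gamma = inf, sum gamma^2 < inf
        theta = theta + gamma * drift(theta)
    return theta

rng = np.random.default_rng(0)

# Instance 1 (the "workhorse" special case): stochastic gradient descent
# for E[(a^T theta - b)^2], with streaming samples b = a^T theta_star + noise.
theta_star = np.array([1.0, -2.0])

def sgd_drift(theta):
    a = rng.normal(size=2)
    b = a @ theta_star + 0.1 * rng.normal()
    return -2.0 * (a @ theta - b) * a  # minus the per-sample gradient

print(stochastic_approximation(np.zeros(2), sgd_drift, num_iters=5000, gamma0=0.5))

# Instance 2 (beyond explicit gradients): track the 0.9-quantile of
# X ~ N(0, 1), i.e. the root of h(theta) = 0.9 - P(X <= theta);
# the true value is about 1.2816. The drift uses only an indicator.
def quantile_drift(theta):
    return 0.9 - float(rng.normal() <= theta)

print(stochastic_approximation(0.0, quantile_drift, num_iters=20000, gamma0=5.0))
```

Both instances share the same driver; only the drift term H changes, which is one way to read the unifying framework the abstract describes.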