Approximation Theory of Deep Learning from the Dynamical Systems Viewpoint

Speaker:  Qianxiao Li

Date:    2022.12.08

Time:    10:00 am

Venue: BIMSA 1129B

Zoom:  537 192 5549   Passcode: BIMSA

Video

Abstract:

In this talk, we present some recent results on the approximation theory of deep learning from the dynamical systems viewpoint. This viewpoint highlights a key new aspect of modern deep learning, namely the presence of compositional/dynamical structures. We first discuss mathematical frameworks for studying the capacity of deep feed-forward architectures for function approximation. Next, we discuss approximation theories of modern architectures for sequence modelling, including recurrent neural networks, dilated convolutional networks (WaveNet), and encoder-decoder structures. These analyses reveal some interesting connections between approximation, dynamics, memory, sparsity and low-rank phenomena that may guide the practical selection and design of these network architectures.
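For readers who have not seen the dynamical systems viewpoint before, the standard observation behind it is that a residual network can be read as a forward-Euler discretization of a controlled ODE. The Python sketch below illustrates only this generic correspondence, not the specific results of the talk; the names `f`, `residual_forward` and the step size `h` are hypothetical choices for illustration.

```python
import numpy as np

def f(x, theta):
    # Parametrized vector field; a simple tanh layer as a stand-in.
    W, b = theta
    return np.tanh(W @ x + b)

def residual_forward(x0, thetas, h=0.1):
    # Residual network: x_{k+1} = x_k + h * f(x_k, theta_k),
    # i.e. a forward-Euler discretization of the ODE dx/dt = f(x, theta(t)).
    x = x0
    for theta in thetas:
        x = x + h * f(x, theta)
    return x

# Toy usage: 5 residual blocks acting on a 3-dimensional input.
rng = np.random.default_rng(0)
thetas = [(rng.standard_normal((3, 3)), rng.standard_normal(3)) for _ in range(5)]
print(residual_forward(np.ones(3), thetas))
```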

Speaker Intro.

Qianxiao Li is an assistant professor in the Department of Mathematics at the National University of Singapore. He graduated with a BA in mathematics from the University of Cambridge and a PhD in applied mathematics from Princeton University. His research interests include the interplay of machine learning and dynamical systems, stochastic gradient algorithms, and the development of data-driven methods for solving problems in the physical sciences. He was awarded the NRF Fellowship in 2021.

[BIMSA-Tsinghua Seminar on Machine Learning and Differential Equations]