Beijing Institute of Mathematical Sciences and Applications (BIMSA)

12 x 2 Lectures on Deep Learning, Geometry, Statistics and Statistical Mechanics
Modern theoretical approaches to deep learning draw heavily on ideas from geometry, statistical physics, and the theory of interacting dynamical systems. At the same time, many classical concepts from statistical learning theory such as generalization, bias–variance tradeoffs, regularization and kernel methods remain central to understanding neural networks. This lecture series presents a unified view of these perspectives. We discuss how deep learning models can be understood as high-dimensional dynamical systems, how training dynamics lead to kernel limits and mean-field descriptions, and how geometric principles such as symmetry and equivariance guide modern architectures.

The course consists of 12 weeks with two lectures per week. Each week focuses on a distinct topic and is designed to be as self-contained as possible. The first lecture provides a conceptual overview of the main theoretical ideas, while the second lecture focuses on practical implementation in JAX.
Lecturer
Shailesh Lal
Date
March 9 to July 3, 2026
Venue
Weekday: Monday, Friday
Time: 13:30 - 15:05
Venue: A3-2a-201
Online: ZOOM 04
ID: 482 240 1589
Password: BIMSA
Prerequisites
A background in geometry, statistical physics, quantum field theory, or mathematical physics, broadly construed. Alternatively, a background in deep learning and/or statistical learning theory and comfort with linear algebra, probability, and calculus.
Syllabus
The lectures are organized into 12 weekly topics, with two lectures each, grouped under broader organizing themes.

STATISTICAL LEARNING FOUNDATIONS
1. Statistical Learning: Learning versus Memorization, Bias, Variance and Regularization, Learning in High Dimensions
2. Generalized Linear Models: Regression and Classification, Overfitting, Regularization
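Since the second lecture each week covers implementation in JAX, the regularization ideas in this theme can be illustrated with a short sketch. This is closed-form ridge regression (L2-regularized least squares); the function name and toy data below are illustrative, not from the course materials.

```python
# Minimal sketch: ridge regression, w = (X^T X + lam I)^{-1} X^T y.
# Names and data are illustrative, assuming the standard L2 penalty.
import jax
import jax.numpy as jnp

def ridge_fit(X, y, lam=0.1):
    """Closed-form ridge solution for design matrix X and targets y."""
    d = X.shape[1]
    return jnp.linalg.solve(X.T @ X + lam * jnp.eye(d), X.T @ y)

key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (50, 3))      # 50 samples, 3 features
w_true = jnp.array([1.0, -2.0, 0.5])
y = X @ w_true                            # noiseless targets
w_hat = ridge_fit(X, y, lam=1e-4)         # small lam: near least-squares
```

With noiseless data and a small penalty, the recovered weights are close to the true ones; raising `lam` shrinks them toward zero, trading bias for variance.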

PRIMER ON NEURAL NETWORKS
3. Multi-layer Perceptrons (Fully Connected Neural Networks): Universal Approximation, Backpropagation, Depth
4. Gradient-Based Optimization: stochastic gradients, improving gradient descent, deep learning dynamics, gradient flows
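The two topics above can be combined in a small JAX sketch: a multi-layer perceptron trained by full-batch gradient descent, with the gradient computed by automatic differentiation (backpropagation). All names and hyperparameters here are illustrative.

```python
# Minimal sketch: an MLP fit to y = sin(3x) with gradient descent.
# Architecture, learning rate, and step count are illustrative choices.
import jax
import jax.numpy as jnp

def init_mlp(key, sizes):
    """Random weights, zero biases, 1/sqrt(fan_in) scaling."""
    params = []
    for din, dout in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        params.append((jax.random.normal(sub, (din, dout)) / jnp.sqrt(din),
                       jnp.zeros(dout)))
    return params

def mlp(params, x):
    for W, b in params[:-1]:
        x = jnp.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b  # linear readout

def loss(params, x, y):
    return jnp.mean((mlp(params, x) - y) ** 2)

@jax.jit
def gd_step(params, x, y, lr=0.1):
    grads = jax.grad(loss)(params, x, y)   # backpropagation
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

key = jax.random.PRNGKey(0)
params = init_mlp(key, [1, 32, 1])
x = jnp.linspace(-1.0, 1.0, 64)[:, None]
y = jnp.sin(3.0 * x)
l0 = loss(params, x, y)
for _ in range(200):
    params = gd_step(params, x, y)
l1 = loss(params, x, y)                    # loss after training
```

Replacing the full batch with random minibatches in `gd_step` turns this into stochastic gradient descent.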

THEORY OF LARGE NEURAL NETWORKS
5. The Neural Tangent Kernel: Training dynamics under gradient descent, lazy learning, escaping lazy learning
6. Mean Field Theory of Neural Networks: Connections to Optimal Transport, no lazy learning
7. Deep Equilibrium Models: Residual Learning, Infinite depth
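The empirical neural tangent kernel of a finite network can be computed directly: it is the Gram matrix of per-example parameter gradients. The sketch below does this for a tiny two-layer network; the network and data are illustrative, not the course's examples.

```python
# Minimal sketch: empirical NTK K(x_i, x_j) = <df(x_i)/dtheta, df(x_j)/dtheta>.
# Network shape and inputs are illustrative assumptions.
import jax
import jax.numpy as jnp

def f(params, x):
    W1, W2 = params
    return jnp.tanh(x @ W1) @ W2  # (n, 1) outputs of a two-layer net

def empirical_ntk(params, X):
    # Jacobian of outputs w.r.t. all parameters, flattened per example.
    jac = jax.jacobian(f)(params, X)  # pytree with leading axis n per leaf
    n = X.shape[0]
    J = jnp.concatenate(
        [j.reshape(n, -1) for j in jax.tree_util.tree_leaves(jac)], axis=1)
    return J @ J.T  # (n, n) kernel matrix

key = jax.random.PRNGKey(1)
k1, k2, k3 = jax.random.split(key, 3)
W1 = jax.random.normal(k1, (2, 16)) / jnp.sqrt(2.0)
W2 = jax.random.normal(k2, (16, 1)) / jnp.sqrt(16.0)
X = jax.random.normal(k3, (5, 2))
K = empirical_ntk((W1, W2), X)
```

By construction `K` is symmetric positive semi-definite; in the lazy (infinite-width) regime this matrix stays approximately fixed during training, which is what reduces the dynamics to kernel regression.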

GEOMETRY AND SYMMETRY AS GUIDING PRINCIPLES IN DEEP LEARNING
8. Convolutional Neural Networks: Image data, Convolutions, Padding and Pooling
9. Geometric Deep Learning: Abstracting from ConvNets, Symmetry, Invariance, Equivariance
10. Deep Learning on Graphs and Sequences: Graph Neural Networks, Message Passing and Self Attention
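Message passing and permutation equivariance can be shown in a few lines: each node averages its neighbors' features and applies a shared update. The graph, feature sizes, and update rule below are illustrative assumptions, not a specific architecture from the course.

```python
# Minimal sketch: one round of mean-aggregation message passing on a graph.
# A is the adjacency matrix, H the node features, W a shared weight matrix.
import jax
import jax.numpy as jnp

def message_passing(A, H, W):
    deg = jnp.maximum(A.sum(axis=1, keepdims=True), 1.0)  # avoid divide-by-zero
    agg = (A @ H) / deg              # mean over each node's neighbors
    return jnp.tanh((H + agg) @ W)   # shared linear update + nonlinearity

# Tiny graph: 4 nodes on a path 0-1-2-3, one-hot node features.
A = jnp.array([[0., 1., 0., 0.],
               [1., 0., 1., 0.],
               [0., 1., 0., 1.],
               [0., 0., 1., 0.]])
H = jnp.eye(4)
W = jax.random.normal(jax.random.PRNGKey(0), (4, 8)) / 2.0
H1 = message_passing(A, H, W)
```

Because aggregation and update are shared across nodes, relabeling the nodes by a permutation P sends the output to its permuted version: `message_passing(P A P^T, P H, W) = P message_passing(A, H, W)`, the equivariance property abstracted in geometric deep learning.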

GENERATIVE MODELS
11. Generative Models: Adversarial Networks (GANs), Wasserstein GANs, Variational Auto-Encoders

STATISTICAL PHYSICS OF DEEP LEARNING
12. Statistical Physics of Deep Learning: Mean fields revisited, Energy landscapes, phase transitions in learning
Audience
Advanced Undergraduate, Graduate, Postdoc, Researcher
Video Public
No
Notes Public
Yes
Language
English
Lecturer Introduction
Shailesh Lal received his PhD from the Harish-Chandra Research Institute. His research interests are the applications of machine learning to string theory and mathematical physics, black holes in string theory, and higher-spin holography.
CONTACT

No. 544, Hefangkou Village, Huaibei Town, Huairou District, Beijing 101408


Tel. 010-60661855
Email. administration@bimsa.cn

Copyright © Beijing Institute of Mathematical Sciences and Applications

京ICP备2022029550号-1

京公网安备11011602001060