Analytical Methods for Neural Networks
Neural networks and deep learning have seen tremendous success across a range of machine learning tasks, spanning computer vision, natural language processing, and even modern mathematical physics. In this course we will take a complementary direction: we will describe how tools from theoretical physics, especially quantum field theory, have been brought to bear on neural networks. To provide a frame of reference for the theory, we will also describe key elements of practical neural network design and training.
Lecturer
Shailesh Lal
Dates
September 18 to November 9, 2023
Location
Weekday | Time | Venue | Online | ID | Password
---|---|---|---|---|---
Monday, Thursday | 08:00 - 11:25 | A3-2a-201 | ZOOM 02 | 518 868 7656 | BIMSA
Prerequisite
Knowledge of quantum field theory is helpful, but not a prerequisite for the course.
Syllabus
1. Neural Networks in Practice:
a. Perceptron and Linear Models
b. Deep Neural Networks
c. Universal Approximation
d. Training Deep Neural Networks
2. Neural Networks in Theory:
a. Ensembles of Neural Networks
b. Deep Linear Networks as toy models for Neural Networks
c. Effective Theory of Neural Networks at Initialization
d. Training Neural Networks; The Neural Tangent Kernel and beyond
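As a small teaser for topic 2a (ensembles of neural networks), the sketch below samples many randomly initialized one-hidden-layer networks and inspects the distribution of the output at a fixed input. With the conventional 1/sqrt(width) output scaling, the ensemble output approaches a Gaussian as the width grows, which is the starting point for the effective field-theory description of networks at initialization. All function and parameter names here are illustrative, not from the course materials.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_output(x, width, n_draws):
    """Draw n_draws networks f(x) = sum_i v_i * tanh(w_i . x) / sqrt(width),
    with all weights drawn i.i.d. from a standard normal at initialization."""
    d = x.shape[0]
    W = rng.normal(0.0, 1.0, size=(n_draws, width, d))  # input-to-hidden weights
    v = rng.normal(0.0, 1.0, size=(n_draws, width))     # hidden-to-output weights
    h = np.tanh(W @ x)                                  # hidden activations, shape (n_draws, width)
    return (v * h).sum(axis=1) / np.sqrt(width)         # standard 1/sqrt(n) output scaling

x = np.ones(3)
outputs = sample_output(x, width=1000, n_draws=5000)
# The ensemble mean is near zero; the variance equals E[tanh(z)^2]
# for z ~ N(0, |x|^2), matching the infinite-width Gaussian prediction.
print(outputs.mean(), outputs.var())
```

Repeating the experiment at small width (say, `width=2`) shows visibly non-Gaussian statistics, which is where the finite-width corrections studied in the theoretical part of the course enter.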
Lecture Videos
Public
Lecture Notes
Public
Lecturer Intro
Shailesh Lal received his PhD from the Harish-Chandra Research Institute. His research interests are applications of machine learning to string theory and mathematical physics, black holes in string theory, and higher-spin holography.