Analytical Methods for Neural Networks
Neural networks and deep learning have seen tremendous success across a wide range of machine learning tasks, spanning computer vision, natural language processing, and even modern mathematical physics. In this course we will take a complementary direction: we will describe how tools from theoretical physics, especially quantum field theory, have been brought to bear on neural networks. To provide a frame of reference for the theory, we will also describe key elements of practical neural network training and design.
Lecturer
Shailesh Lal
Date
18th September – 9th November, 2023
Location
Weekday | Time | Venue | Online | Zoom ID | Password |
---|---|---|---|---|---|
Monday, Thursday | 08:00 - 11:25 | A3-2a-201 | ZOOM 02 | 518 868 7656 | BIMSA |
Prerequisite
Knowledge of quantum field theory is helpful, but not a prerequisite for the course.
Syllabus
1. Neural Networks in Practice:
a. Perceptron and Linear Models
b. Deep Neural Networks
c. Universal Approximation
d. Training Deep Neural Networks
2. Neural Networks in Theory:
a. Ensembles of Neural Networks
b. Deep Linear Networks as toy models for Neural Networks
c. Effective Theory of Neural Networks at Initialization
d. Training Neural Networks: the Neural Tangent Kernel and beyond
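As a small taste of syllabus item 1a, here is a minimal sketch of the classic perceptron update rule in Python. The toy dataset, variable names, and epoch budget are illustrative assumptions, not material taken from the course.

```python
import numpy as np

# Hypothetical toy data: 2-D points labelled by a fixed linear rule,
# so the dataset is linearly separable by construction.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X @ np.array([1.0, -2.0]) + 0.5 > 0, 1, -1)

# Classic perceptron update: w <- w + y_i x_i on each misclassified point.
w = np.zeros(2)
b = 0.0
for _ in range(100):  # epochs (an arbitrary cap for this sketch)
    errors = 0
    for xi, yi in zip(X, y):
        if yi * (xi @ w + b) <= 0:  # misclassified (or on the boundary)
            w += yi * xi
            b += yi
            errors += 1
    if errors == 0:  # converged: no mistakes in a full pass
        break

print("learned weights:", w, "bias:", b)
```

On separable data the perceptron rule converges after finitely many updates, so the loop simply stops once an epoch makes no mistakes.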
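For the theoretical half of the syllabus (items 2a and 2c), the following is a small Monte Carlo sketch of the ensemble picture: sample many randomly initialized one-hidden-layer networks and watch the distribution of the output over the ensemble become Gaussian as the width grows. The architecture, tanh activation, 1/sqrt(n) scaling, and sample sizes below are standard illustrative choices, not taken from the lectures.

```python
import numpy as np

# Ensemble of networks f(x) = (1/sqrt(n)) * sum_j v_j * tanh(w_j . x)
# with i.i.d. standard-normal weights. As the width n grows, f(x)
# over the ensemble approaches a Gaussian distribution.
rng = np.random.default_rng(1)
x = np.array([0.6, -0.8])  # a fixed input; its exact value is arbitrary

def sample_outputs(width, n_nets=5000):
    W = rng.normal(size=(n_nets, width, x.size))  # hidden-layer weights
    v = rng.normal(size=(n_nets, width))          # readout weights
    h = np.tanh(W @ x)                            # hidden activations, shape (n_nets, width)
    return (v * h).sum(axis=1) / np.sqrt(width)   # 1/sqrt(n) readout scaling

for n in (2, 10, 500):
    f = sample_outputs(n)
    # Excess kurtosis of a Gaussian is 0; it should shrink with width.
    kurt = ((f - f.mean()) ** 4).mean() / f.var() ** 2 - 3.0
    print(f"width={n:4d}  var={f.var():.3f}  excess kurtosis={kurt:+.3f}")
```

At small width the excess kurtosis is visibly nonzero, while at large width it is close to zero; this central-limit mechanism underlies the Gaussian-process description of wide networks at initialization that the effective-theory part of the course builds on.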
Video Public
Yes
Notes Public
Yes
Lecturer Intro
Dr Shailesh Lal received his PhD from the Harish-Chandra Research Institute. His research interests include applications of machine learning to string theory and mathematical physics, black holes in string theory, and higher-spin holography.