Beijing Institute of Mathematical Sciences and Applications

Probabilistic machine learning
The probabilistic approach in machine and deep learning leads to principled solutions. It provides explainable decisions and new ways to improve existing methods. Bayesian machine learning consists of probabilistic approaches that rely on the Bayes formula; it helps in numerous applications and rests on beautiful mathematical concepts. In this course, I will describe the foundations of Bayesian machine learning and show how it works as part of the deep learning framework.
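As a minimal illustration of the Bayes formula that underlies the course, consider a made-up diagnostic-test example (all the numbers below are hypothetical, chosen only for illustration):

```python
# Bayes' formula: P(H | E) = P(E | H) * P(H) / P(E),
# illustrated with a hypothetical diagnostic test.

prior = 0.01           # P(disease): assumed base rate
sensitivity = 0.95     # P(positive | disease)
false_positive = 0.05  # P(positive | no disease)

# Law of total probability: overall chance of a positive result.
evidence = sensitivity * prior + false_positive * (1 - prior)

# Posterior probability of disease given a positive test.
posterior = sensitivity * prior / evidence
print(round(posterior, 3))  # → 0.161
```

Even with a sensitive test, the low prior keeps the posterior modest; this interplay between prior and likelihood is the recurring theme of the course.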
Lecturer
Alexey Zaytsev
Date
12th October, 2023 ~ 17th January, 2024
Location
Weekday: Wednesday, Thursday
Time: 17:05 - 18:40
Venue: A3-2-301
Online: ZOOM 02
ID: 518 868 7656
Password: BIMSA
Prerequisite
Probability theory, Mathematical statistics, Machine learning
Syllabus
Block 1: Basics of Bayesian approach
Lecture 1. The problem of statistical estimation. Basics of probability distributions. The maximum likelihood approach. The concept of Bayesian inference.
Lecture 2. Linear regression.
Lecture 3. Bayesian linear regression.
Lecture 4. Logistic regression. Bayesian logistic regression. Laplace approximation.
Lecture 5. Exponential family of distributions. Conjugate priors. Bayesian inference for exponential family. Optimality of exponential family.
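The conjugate updates covered in this block admit closed forms. A minimal sketch, assuming the simplest case of Bayesian linear regression from Lecture 3: a scalar weight, known noise variance, and a Gaussian prior, so the posterior over the weight is again Gaussian:

```python
import random

# Conjugate Bayesian inference for y = w * x + noise, with a scalar
# weight w, known noise variance, and a Gaussian prior on w.
# The data-generating parameters below are made up for illustration.
random.seed(0)
true_w, noise_sd = 2.0, 0.5
xs = [i / 10 for i in range(1, 21)]
ys = [true_w * x + random.gauss(0, noise_sd) for x in xs]

prior_mean, prior_var = 0.0, 10.0
noise_var = noise_sd ** 2

# Standard Gaussian conjugate update: precisions add, means are
# precision-weighted.
post_prec = 1 / prior_var + sum(x * x for x in xs) / noise_var
post_mean = (prior_mean / prior_var
             + sum(x * y for x, y in zip(xs, ys)) / noise_var) / post_prec
post_var = 1 / post_prec
print(post_mean, post_var)  # mean near true_w, variance shrunk vs. prior
```

The posterior variance is much smaller than the prior variance, quantifying how much the 20 observations have reduced uncertainty about the weight.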

Block 2: Approximate inference
Lecture 6. Sampling problem statement. Importance sampling.
Lecture 7. Monte Carlo sampling. MCMC. The Metropolis–Hastings algorithm.
Lecture 8. Hamiltonian Monte Carlo.
Lecture 9. EM algorithm. Expectation propagation.
Lecture 10. Variational inference. The evidence lower bound (ELBO).
Lecture 11. Variational inference in practice. Variance reduction.
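The Metropolis–Hastings algorithm from this block fits in a few lines. A dependency-free sketch targeting a standard normal, with an ad hoc step size and chain length chosen only for illustration:

```python
import math
import random

# Random-walk Metropolis-Hastings targeting a standard normal density.
random.seed(42)

def log_target(x):
    return -0.5 * x * x  # log N(0, 1) up to an additive constant

x, samples = 0.0, []
for _ in range(50_000):
    proposal = x + random.gauss(0, 1.0)  # symmetric random-walk proposal
    # Accept with probability min(1, target(proposal) / target(x)),
    # computed in log space for numerical stability.
    if math.log(random.random()) < log_target(proposal) - log_target(x):
        x = proposal
    samples.append(x)

burned = samples[5_000:]  # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
print(mean, var)  # mean near 0, variance near 1
```

Because only the ratio of target densities is needed, the normalizing constant never appears, which is exactly what makes MCMC applicable to unnormalized posteriors.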

Block 3: Gaussian process models
Lecture 12. Gaussian process regression. Exact inference scheme. Connection to RKHS.
Lecture 13. Efficient Gaussian process regression. Fourier features. Nyström approximation.
Lecture 14. Approximate generalized Gaussian process models. Heteroscedasticity modeling.
Lecture 15. Risk estimation for Gaussian process regression. Parametric and non-parametric approaches.
Lecture 16. Active learning.
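The exact inference scheme from Lecture 12 can be sketched on a toy 1-D dataset. This is an illustrative sketch with an RBF kernel and made-up data; the linear solve uses naive Gaussian elimination so the example stays dependency-free:

```python
import math

# Exact Gaussian process regression posterior mean: k_*^T K^{-1} y.

def rbf(a, b, length=1.0):
    return math.exp(-0.5 * (a - b) ** 2 / length ** 2)

def solve(A, b):
    """Naive Gaussian elimination with partial pivoting (tiny systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

X = [0.0, 1.0, 2.0]
y = [math.sin(v) for v in X]
noise = 1e-6  # jitter on the diagonal for numerical stability

K = [[rbf(a, b) + (noise if i == j else 0.0)
      for j, b in enumerate(X)] for i, a in enumerate(X)]
alpha = solve(K, y)  # alpha = K^{-1} y

def predict(x_star):
    # Posterior mean at a test point.
    return sum(rbf(x_star, xi) * ai for xi, ai in zip(X, alpha))

print(predict(1.0))  # close to sin(1.0): near-interpolation at small noise
```

With near-zero noise the posterior mean interpolates the training targets; the efficient approximations of Lecture 13 exist precisely because the exact `K^{-1}` solve scales cubically in the number of points.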

Block 4: Bayesian neural networks
Lecture 17. Bayesian optimization.
Lecture 18. Neural network basics. Bayesian dropout. Optimization of neural networks.
Lecture 19. Uncertainty estimation in machine learning: Bayesian and non-Bayesian methods.
Lecture 20. Deep Gaussian process regression in embedding space.
Lecture 21. Loss surfaces for deep neural networks.
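The Bayesian dropout idea from Lecture 18 can be sketched without a real network. In this illustrative toy, a fixed linear model stands in for a trained layer; random dropout masks at prediction time produce a distribution of outputs whose spread serves as an uncertainty proxy (all weights and inputs below are made up):

```python
import random

# Monte-Carlo dropout sketch on a tiny fixed linear "network".
random.seed(1)
weights = [0.5, -1.2, 0.8, 2.0]
features = [1.0, 0.5, -1.0, 0.25]
keep_prob = 0.8

def dropout_predict():
    out = 0.0
    for w, f in zip(weights, features):
        if random.random() < keep_prob:  # keep the unit...
            out += (w / keep_prob) * f   # ...with inverted-dropout scaling
    return out

# Many stochastic forward passes approximate a predictive distribution.
preds = [dropout_predict() for _ in range(10_000)]
mean = sum(preds) / len(preds)
std = (sum((p - mean) ** 2 for p in preds) / len(preds)) ** 0.5
print(mean, std)  # mean near the deterministic output, std > 0
```

The inverted scaling by `keep_prob` keeps the stochastic predictions unbiased, so the mean matches the deterministic forward pass while the standard deviation captures the dropout-induced uncertainty.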

Block 5: Probabilistic generative models
Lecture 22. Variational autoencoders.
Lecture 23. Normalizing flows.
Lecture 24. Diffusion processes 1.
Lecture 25. Diffusion processes 2.
Lecture 26. Diffusion processes 3.
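A key closed form behind the diffusion lectures: with a variance schedule beta_t, the forward marginal is q(x_t | x_0) = N(sqrt(alpha_bar_t) x_0, 1 - alpha_bar_t), where alpha_bar_t is the running product of (1 - beta_t). A sketch with a linear schedule (the schedule endpoints are a common but here merely illustrative choice):

```python
import math
import random

# Forward diffusion: a clean point x0 is progressively destroyed into
# standard Gaussian noise as alpha_bar decays from ~1 to ~0.
random.seed(7)
T = 1000
betas = [1e-4 + (0.02 - 1e-4) * t / (T - 1) for t in range(T)]

alpha_bar, prod = [], 1.0
for b in betas:
    prod *= 1.0 - b
    alpha_bar.append(prod)

x0 = 1.5

def sample_xt(t):
    # One draw from the closed-form marginal q(x_t | x_0).
    eps = random.gauss(0, 1)
    return math.sqrt(alpha_bar[t]) * x0 + math.sqrt(1 - alpha_bar[t]) * eps

print(alpha_bar[0], alpha_bar[-1])  # near 1 at t=0, near 0 at t=T-1
```

Because the marginal is available in closed form, training can sample any timestep directly instead of simulating the whole chain, which is what makes denoising-based training practical.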

Block 6: Point processes
Lecture 27. Basics: Poisson processes, non-homogeneous Poisson processes. Maximum likelihood estimation.
Lecture 28. Hawkes processes. Deep Hawkes processes based on RNNs and Transformers.
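The maximum likelihood step for a homogeneous Poisson process has a simple closed form: the MLE of the rate is the event count divided by the observation horizon. A sketch that simulates event times via exponential inter-arrival gaps and recovers the rate (the rate and horizon below are arbitrary illustrative values):

```python
import random

# Homogeneous Poisson process: simulate on [0, horizon], then estimate
# the rate by maximum likelihood (rate_mle = N / horizon).
random.seed(3)
rate, horizon = 2.5, 1000.0

times, t = [], 0.0
while True:
    t += random.expovariate(rate)  # exponential gap with mean 1/rate
    if t > horizon:
        break
    times.append(t)

rate_mle = len(times) / horizon
print(rate_mle)  # close to the true rate 2.5
```

The Hawkes processes of Lecture 28 generalize this by letting past events excite the intensity, so the likelihood no longer factorizes this simply and numerical optimization (or a neural intensity model) is needed.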
Reference
The topic of the course is at the edge of current advances in the field. We will provide a list of articles and books for each lecture. Good starting points are:
1. C. Bishop, "Pattern Recognition and Machine Learning", 2006 (Blocks 1 and 2).
2. C. Rasmussen and C. Williams, "Gaussian Processes for Machine Learning", 2006 (Block 3).

Tools: Notebook with access to colab.google.com
Audience
Graduate
Video Public
Yes
Notes Public
Yes
Language
English
Lecturer Intro
Alexey has deep expertise in machine learning and the processing of sequential data. He publishes at top venues, including KDD, ACM Multimedia, and AISTATS. Industrial applications of his results are in service at companies including Airbus, Porsche, and Saudi Aramco.
CONTACT

No. 544, Hefangkou Village, Huaibei Town, Huairou District, Beijing 101408


Tel. 010-60661855
Email. administration@bimsa.cn

Copyright © Beijing Institute of Mathematical Sciences and Applications

京ICP备2022029550号-1

京公网安备11011602001060