Sequence Data Models
In this course, we discuss the forefront of modern research in learning from sequence data. The course walks from the basics of sequence processing to current deep learning approaches. We aim to cover both fundamental and modern advances in this area that are not commonly discussed in undergraduate or graduate machine learning and deep learning classes.
Instructor
Alexey Zaytsev
Dates
November 1, 2022 to January 10, 2023
Website
Prerequisites
Probability theory, Machine learning
Syllabus
Block 1: Classic approach
Lecture 1. Time series as tabular data. Introduction to machine learning
Lecture 2. Stationarity of time series and ARIMA models
Lecture 3. Stationarity diagnostics
Block 2: Neural networks before attention
Lecture 4. The representation learning concept. Recurrent neural networks: LSTM, GRU, and other architectures
Lecture 5. Convolutional neural networks for sequential data
Lecture 6. Opening the black box of a neural network: sensitivity analysis
Block 3: Attention mechanism
Lecture 7. The attention mechanism
Lecture 8. Transformers
Lecture 9. Efficient Transformers
Block 4: Additional topics
Lecture 10. Adversarial attacks and stability of deep learning models
Lecture 11. Distances between sequences: DTW and soft DTW. Metric learning
Lecture 12. Self-supervised learning for sequential data I
Lecture 13. Change point detection
Block 5: Temporal point processes
Lecture 14. Basics: the Poisson process, the Hawkes process, and maximum likelihood estimation for them
Lecture 15. Event sequences as temporal point processes
Lecture 16. Neural network models for event sequences
Lecture 17. Processing of spatio-temporal data
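As a small taste of the material, here is a textbook dynamic-programming sketch of dynamic time warping (DTW), one of the sequence distances covered in Lecture 11. This is an illustrative implementation, not code from the course itself:

```python
# Dynamic time warping (DTW) between two 1-D sequences.
# Textbook O(n*m) dynamic-programming version for illustration only.
def dtw(a, b):
    """Return the DTW distance between sequences a and b."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = DTW distance between the prefixes a[:i] and b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible alignments:
            # match, repeat a[i-1], or repeat b[j-1]
            cost[i][j] = d + min(cost[i - 1][j - 1],
                                 cost[i - 1][j],
                                 cost[i][j - 1])
    return cost[n][m]

print(dtw([1, 2, 3], [1, 2, 2, 3]))  # 0.0: the warping absorbs the repeated 2
```

Unlike the Euclidean distance, DTW allows elastic alignment in time, which is why the repeated sample above costs nothing.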
Materials
Tools: Notebook with access to colab.google.com
Audience
Graduate
Videos
Public
Notes
Public
Language
English
讲师介绍
Alexey has deep expertise in machine learning and the processing of sequential data. He publishes at top venues, including KDD, ACM Multimedia, and AISTATS. Industrial applications of his results are in service at companies including Airbus, Porsche, and Saudi Aramco.