Models of Sequential Data
In this course, we discuss the forefront of modern research in learning from sequence data. The course progresses from the basics of sequence processing to current deep learning approaches. We aim to cover both fundamental and modern advances in this area that are not commonly discussed in undergraduate or graduate machine learning and deep learning classes.
Lecturer
Alexey Zaytsev
Date
1 November 2022 to 10 January 2023
Website
Prerequisite
Probability theory, Machine learning
Syllabus
Block 1: Classic approach
Lecture 1. Time Series as tabular data. Intro to Machine learning
Lecture 2. Stationarity of time series and ARIMA models
Lecture 3. Stationarity diagnostics
Block 2: Neural networks before attention
Lecture 4. Representation learning concept. Recurrent Neural Networks. LSTM. GRU. Other architectures
Lecture 5. Convolutional neural networks for sequential data
Lecture 6. Opening the black box of a neural network: sensitivity analysis
Block 3: Attention mechanism
Lecture 7. Attention Mechanism
Lecture 8. Transformers
Lecture 9. Efficient Transformers
Block 4: Additional topics
Lecture 10. Adversarial Attacks and stability of deep learning models
Lecture 11. Distances between sequences. DTW + soft DTW. Metric learning
Lecture 12. Self-supervised learning for sequential data I
Lecture 13. Change point detection
Block 5: Temporal point processes
Lecture 14. Basics: Poisson process, Hawkes process, and maximum likelihood estimation for both
Lecture 15. Event sequences as temporal point processes
Lecture 16. Neural network models for event sequences
Lecture 17. Processing of spatio-temporal data
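To give a flavour of the material, the dynamic time warping distance from Lecture 11 can be sketched in a few lines. This is a minimal illustration of the classic algorithm, not course code:

```python
import numpy as np

def dtw_distance(x, y):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(x), len(y)
    # cost[i, j] = minimal cumulative cost of aligning x[:i] with y[:j]
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # step in x only
                                 cost[i, j - 1],      # step in y only
                                 cost[i - 1, j - 1])  # step in both
    return cost[n, m]

# A repeated value costs nothing under warping:
print(dtw_distance([1, 2, 3], [1, 2, 2, 3]))  # 0.0
```

The course also covers soft DTW, a differentiable relaxation of this recursion that can serve as a training loss.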
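Similarly, the homogeneous Poisson process from Lecture 14 can be simulated by drawing i.i.d. exponential inter-arrival times, and its rate recovered by maximum likelihood. A minimal sketch, assuming a plain NumPy setup:

```python
import numpy as np

def simulate_poisson_process(rate, horizon, rng):
    """Homogeneous Poisson process on [0, horizon]: i.i.d. exponential gaps."""
    times = []
    t = rng.exponential(1.0 / rate)
    while t < horizon:
        times.append(t)
        t += rng.exponential(1.0 / rate)
    return np.array(times)

rng = np.random.default_rng(0)
events = simulate_poisson_process(rate=2.0, horizon=1000.0, rng=rng)

# The MLE of the rate is simply (number of events) / (observation horizon)
rate_hat = len(events) / 1000.0
print(rate_hat)  # close to the true rate 2.0
```

The Hawkes process extends this picture with self-excitation: each event temporarily raises the intensity of future events.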
Reference
Tools: Notebook with access to colab.google.com
Audience
Graduate
Video Public
Yes
Notes Public
Yes
Language
English
Lecturer Intro
Alexey has deep expertise in machine learning and the processing of sequential data. He publishes at top venues, including KDD, ACM Multimedia, and AISTATS. Industrial applications of his results are now in service at companies including Airbus, Porsche, and Saudi Aramco.