Markov Chains
Background: The modern theory of Markov chain mixing is the result of the convergence, in the 1980s and 1990s, of several threads:
For statistical physicists, Markov chains became useful in Monte Carlo simulation; the mixing time determines the running time of a simulation. Deep connections were found between rapid mixing and spatial properties of spin systems. In theoretical computer science, Markov chains play a key role in sampling and approximate counting algorithms.
At the same time, mathematicians were intensively studying random walks on groups. Both spectral methods and probabilistic techniques, such as coupling, played important roles. Ingenious constructions of expander graphs (on which random walks mix especially fast) were found. The connection between eigenvalues and expansion properties was first discovered in differential geometry, but then became central to the study of Markov chain mixing.
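For orientation, here is a brief summary, in the notation of the course textbook, of the central quantities mentioned above: total variation distance, mixing time, relaxation time, and the bottleneck (expansion) ratio. The bounds quoted are the standard relaxation-time and Cheeger-type estimates treated in Weeks 6 and 7 below.
\[
\|\mu-\nu\|_{\mathrm{TV}} = \max_{A\subseteq\Omega}|\mu(A)-\nu(A)| = \tfrac12\sum_{x\in\Omega}|\mu(x)-\nu(x)|,
\qquad
d(t) = \max_{x\in\Omega}\bigl\|P^t(x,\cdot)-\pi\bigr\|_{\mathrm{TV}},
\qquad
t_{\mathrm{mix}}(\varepsilon) = \min\{t : d(t)\le\varepsilon\},\quad t_{\mathrm{mix}} := t_{\mathrm{mix}}(1/4).
\]
For a reversible, irreducible chain with absolute spectral gap \(\gamma_*\) and relaxation time \(t_{\mathrm{rel}} = 1/\gamma_*\),
\[
(t_{\mathrm{rel}}-1)\log\Bigl(\tfrac{1}{2\varepsilon}\Bigr) \;\le\; t_{\mathrm{mix}}(\varepsilon) \;\le\; \log\Bigl(\tfrac{1}{\varepsilon\,\pi_{\min}}\Bigr)\,t_{\mathrm{rel}},
\]
and the spectral gap \(\gamma\) is related to the bottleneck ratio \(\Phi_*\) by \(\Phi_*^2/2 \le \gamma \le 2\Phi_*\).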
This term we will be using Piazza for class discussion. The system is designed to get you help quickly and efficiently from classmates, the TA, and me. Rather than emailing questions to the teaching staff, I encourage you to post your questions on Piazza. If you have any problems or feedback for the developers, email team@piazza.com.
Find our class page at: https://piazza.com/bimsa/fall2022/markov/home
Lecturer
Dates
October 12, 2022 to January 6, 2023
Website
Prerequisites
Undergraduate probability and linear algebra, plus some measure-theoretic probability: conditional expectation and laws of large numbers. The first chapter of Durrett's graduate text, or the course "Probability 1" given by Prof. Hao Wu at Tsinghua, will provide ample background; that course can be taken in parallel with this one.
The textbook for the course is “Markov Chains and Mixing Times”, second edition; a PDF is available from https://www.yuval-peres-books.com/markov-chains-and-mixing-times/
In the last part of the course, we will see new material that is not in that book, and explore open problems and directions of current research.
Syllabus
Week 1: Basic definitions and examples
1.1. Why Markov Chains?
1.2. Random Mapping Representation
1.3. Irreducibility and Aperiodicity
1.4. Random Walks on Graphs
1.5. Stationary Distributions
1.6. Reversibility and Time Reversals
1.7. Gambler's Ruin
1.8. Coupon Collecting
1.9. The Cycle and the Hypercube
1.10. The Pólya Urn Model
Week 2: Mixing, coupling and stationary times
2.1. Total Variation Distance and coupling
2.2. The Convergence Theorem
2.3. Mixing in total variation (illustrated numerically in the sketch after the syllabus)
2.4. Uniform mixing and mixing in other norms
2.5. Grand Couplings
2.6. Top-to-Random Shuffle
2.7. Stationary Times
2.8. Strong Stationary Times and separation distance
2.9. Stationary Times and Cesàro Mixing Time
2.10. Optimal strong stationary times
Week 3: Lower Bounds and the symmetric group
3.1. Counting and Diameter Bounds
3.2. Bottleneck Ratio
3.3. Distinguishing Statistics
3.4. The Symmetric Group
3.5. Random Transpositions and adjacent transpositions
3.6. Riffle shuffles
Week 4: Random Walks, electrical networks and hitting times
4.1. Networks and Reversible Markov Chains
4.2. Harmonic Functions
4.3. Voltages and Current Flows
4.4. Computing effective resistance
4.5. Random Target Times
4.6. Commute Time
4.7. Hitting Times on Trees
4.8. Hitting Times for Eulerian Graphs
4.9. Hitting Times for the Torus
Week 5: From hitting times to Cover Times
5.1. Bounding Mixing Times via Hitting Times
5.2. The Matthews Method
5.3. Spanning Tree Bound for Cover Time
5.4. The Gaussian Free field
Week 6: Eigenvalues
6.1. The Spectral Representation of a Reversible Transition Matrix
6.2. The Relaxation Time
6.3. Eigenvalues and Eigenfunctions of Some Simple Random Walks
6.4. Spectral Formula for the Target Time
6.5. Time Averages approximate space averages: MCMC
6.6. Bounds on Spectral Gap via Contractions
Week 7: Expansion of graphs and networks
7.1. The Dirichlet Form and the Bottleneck Ratio
7.2. Comparison of Markov Chains
7.3. Expander Graphs
Week 8: The Transportation Metric and Path Coupling
8.1. Path Coupling
8.2. Rapid Mixing for Colorings
8.3. Approximate Counting
8.4. Curvature of Markov chains
Week 9: Simulation and perfect sampling
9.1. Monte Carlo sampling
9.2. Unbiasing random bits
9.3. Glauber and Metropolis dynamics
9.4. Coupling from the past
9.5. Sampling uniform spanning trees
Week 10: The Ising Model
10.1. Fast Mixing at High Temperature
10.2. The Complete Graph
10.3. The Cycle
10.4. The Tree
10.5. Block Dynamics
10.6. Lower Bound for Ising on Square
Week 11: The Cutoff Phenomenon and continuous-time chains
11.1. The hypercube versus the cycle
11.2. The product condition and the Aldous-Pak examples
11.3. Equivalence of continuous-time chains and lazy chains
11.4. Birth-death chains and cutoff on trees
11.5. Ramanujan graphs
Week 12: Research directions
12.1. Universal lower bound for Ising, open for Potts
12.2. Swendsen-Wang Dynamics: the exponent quarter conjecture
12.3. Monotone subsets of the cube: The Ding-Mossel Theorem
12.4. Random walks on invertible matrices modulo 2
12.5. Random walk on percolation clusters and dynamical percolation
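As a small, self-contained illustration of the first two weeks' topics (the lazy random walk on the cycle from 1.9 and mixing in total variation from 2.1 and 2.3), the following sketch computes the exact total variation mixing time of the lazy walk on the n-cycle by powering its transition matrix. It is only an illustrative sketch, not part of the course materials, and assumes Python with numpy.

import numpy as np

def lazy_cycle_walk(n):
    """Transition matrix of the lazy simple random walk on the n-cycle:
    hold with probability 1/2, otherwise move to a uniformly chosen neighbour."""
    P = np.zeros((n, n))
    for x in range(n):
        P[x, x] += 0.5
        P[x, (x + 1) % n] += 0.25
        P[x, (x - 1) % n] += 0.25
    return P

def tv_distance(mu, nu):
    """Total variation distance: half the L1 distance between two distributions."""
    return 0.5 * np.abs(mu - nu).sum()

def mixing_time(P, eps=0.25):
    """Smallest t such that max_x ||P^t(x, .) - pi||_TV <= eps.
    Here the stationary distribution pi is uniform, since P is symmetric."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)
    Pt = np.eye(n)          # P^0
    t = 0
    while max(tv_distance(Pt[x], pi) for x in range(n)) > eps:
        Pt = Pt @ P
        t += 1
    return t

if __name__ == "__main__":
    for n in (8, 16, 32):
        print(n, mixing_time(lazy_cycle_walk(n)))

For the cycle the printed values grow on the order of n^2, consistent with the bounds developed in Weeks 2 and 6.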
References
https://www.yuval-peres-books.com/markov-chains-and-mixing-times/
However, in the second part of the course we will see new material that is not in that book and explore open problems and directions of current research.
Audience
Graduate
Videos
Public
Lecture Notes
Public
Language
English
Lecturer Introduction
Yuval Peres obtained his PhD from the Hebrew University of Jerusalem in 1990 and held postdoctoral positions at Stanford University and Yale University. He was subsequently a Professor of Mathematics and Statistics at UC Berkeley and at the Hebrew University of Jerusalem, and a Principal Researcher at Microsoft. Peres has published more than 350 papers in most areas of probability theory, including random walks, Brownian motion, percolation, and random graphs. He has co-authored books on fractals in probability and analysis, Brownian motion, zeros of Gaussian analytic functions and determinantal point processes, Markov chains and mixing times, probability on trees and networks, and game theory, published by the American Mathematical Society and Cambridge University Press. Details about the books can be found at https://www.yuval-peres-books.com/ and his presentations at https://yuval-peres-presentations.com/.
Peres is a recipient of the Rollo Davidson Prize and the Loève Prize. He was an invited speaker at the 2002 International Congress of Mathematicians in Beijing, the 2008 European Congress of Mathematics, and the 2017 annual meeting of the American Mathematical Society, and was elected to the US National Academy of Sciences in 2016. He has supervised 21 PhD students, including Elchanan Mossel (professor at MIT, Fellow of the AMS), Jian Ding (Peking University; recipient of the ICCM Gold Medal and the Rollo Davidson Prize), and Bálint Virág and Gábor Pete (Rollo Davidson Prize recipients).