Zhou Fan, Yale University, Spring 2019
A seminar-style introduction to random matrix theory. Wigner matrices, sample covariance matrices, spiked models. Applications to principal components analysis, spectral algorithms on graphs and networks, and landscape analysis of non-convex optimization problems. Methods for non-invariant models that commonly arise in applications: moment method, concentration of measure, resolvents and Stieltjes transforms, free probability, Lindeberg exchange.
Prerequisites: Graduate-level probability theory
A small number of homework problems will be assigned during lectures. You may select any three problems to solve. Solutions must be typewritten and are due on the last day of reading period, Wednesday, May 1.
A typical homework problem may require you to prove an extension of a result from class or to implement an algorithm discussed in class and to explore its performance in simulation. You are encouraged to consult textbooks and other literature. Any such literature must be properly cited in a References section at the end of your write-up.
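For concreteness, the following is a minimal sketch of the kind of simulation such a problem might involve: a numerical check of Wigner's semicircle law for a Gaussian Wigner matrix. The choice of language (Python), the matrix size n, and the scaling convention are illustrative only and are not requirements of the course.

import numpy as np
import matplotlib.pyplot as plt

# Illustrative example (not an assigned problem): compare the empirical
# eigenvalue distribution of an n x n Gaussian Wigner matrix with the
# semicircle density on [-2, 2].
n = 2000
rng = np.random.default_rng(0)

# Symmetrize an i.i.d. N(0, 1) matrix; the scaling 1/sqrt(2n) gives
# off-diagonal entries of variance 1/n (GOE-type normalization).
A = rng.standard_normal((n, n))
W = (A + A.T) / np.sqrt(2 * n)

eigvals = np.linalg.eigvalsh(W)

# Semicircle density: rho(x) = sqrt(4 - x^2) / (2*pi) on [-2, 2].
x = np.linspace(-2, 2, 400)
rho = np.sqrt(4 - x**2) / (2 * np.pi)

plt.hist(eigvals, bins=60, density=True, alpha=0.5, label="empirical spectrum")
plt.plot(x, rho, label="semicircle density")
plt.legend()
plt.show()

A write-up based on a simulation like this would typically also vary the entry distribution or the matrix size and discuss how the empirical spectrum behaves.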
Written lecture notes will be posted on Canvas. Material in the first half of the course will be drawn from:
An Introduction to Random Matrices, by Greg W. Anderson, Alice Guionnet, and Ofer Zeitouni
Topics in Random Matrix Theory, by Terence Tao
Material in the second half of the course will be based on research papers, which will also be posted on Canvas.