Zhou Fan, Yale University, Fall 2024
This course covers non-asymptotic methods in high-dimensional probability that find common use in applications across statistics, computer science, data science, and engineering. Topics include tail bounds for sums of i.i.d. random variables and martingale differences, concentration inequalities for non-linear functions, matrix concentration, and suprema of stochastic processes.
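As a point of reference for the first topic, the LaTeX snippet below states Hoeffding's inequality, a classical example of a tail bound for a sum of independent bounded random variables. It is offered only as an illustration of the kind of result covered and is not quoted from the course materials.

% Illustrative only: Hoeffding's inequality, a classical tail bound for sums
% of independent bounded random variables (not taken from the syllabus).
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
\textbf{Hoeffding's inequality.} Let $X_1,\dots,X_n$ be independent random
variables with $X_i \in [a_i, b_i]$ almost surely, and let
$S_n = \sum_{i=1}^n X_i$. Then for every $t > 0$,
\[
  \mathbb{P}\bigl( S_n - \mathbb{E}[S_n] \ge t \bigr)
  \le \exp\!\left( -\frac{2t^2}{\sum_{i=1}^n (b_i - a_i)^2} \right).
\]
\end{document}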
Prerequisites: S&DS 351b/551b or S&DS 400/600 (may be taken concurrently) or permission of instructor.
Homework will be assigned approximately weekly and is due Wednesdays at 2pm on Gradescope. Homework assignments will constitute 100% of your course grade. Late homework will not be accepted. Your lowest homework grade will be dropped.
You are encouraged to work on homework problems with your classmates, but you must write up the solutions yourself. Please indicate the names of your collaborators at the top of your assignment.
High-Dimensional Probability: An Introduction with Applications in Data Science, Roman Vershynin
Concentration Inequalities: A Nonasymptotic Theory of Independence, Stéphane Boucheron, Gábor Lugosi, Pascal Massart
Probability in High Dimension, Ramon van Handel