I am an Assistant Professor in the Department of Statistics and Data Science at Yale University. My research interests lie at the intersection of mathematical statistics, probability theory, and computational algorithms, with applications in genetics and computational biology.

Before joining Yale, I received my Ph.D. from Stanford University in 2018, where I was co-advised by Andrea Montanari and Iain Johnstone. From 2011 to 2013, I worked at D. E. Shaw Research, developing statistical and software tools for molecular dynamics simulations of biomolecules.

CV

Postdoc opportunity: A postdoc position is available for an appointment of up to two years, starting in Fall 2023. To apply, please email me a CV and a link to one, or at most two, representative publications. Applications will be reviewed starting December 20, 2022, and on a rolling basis thereafter.

The ideal candidate should have a strong background in a topic at the intersection of probability theory and statistics or machine learning. Possible topics include (but are not limited to) random matrix theory, random graphs and networks, statistical physics, dynamics of learning algorithms, sampling in high dimensions, and optimal transport.

recent/representative papers

(Empirical) Bayes PCA, Approximate Message Passing algorithms, and variational inference
Universality of Approximate Message Passing algorithms and tensor networks (w/ Tianhao Wang, Xinyi Zhong)
TAP equations for orthogonally invariant spin glasses at high temperature (w/ Yufan Li, Subhabrata Sen)
Approximate Message Passing for orthogonally invariant ensembles: Multivariate non-linearities and spectral initialization (w/ Xinyi Zhong, Tianhao Wang)
Local convexity of the TAP free energy and AMP convergence for Z2-synchronization (w/ Michael Celentano, Song Mei)
The replica-symmetric free energy for Ising spin glasses with orthogonally invariant couplings (w/ Yihong Wu)
Approximate Message Passing algorithms for rotationally invariant matrices
Empirical Bayes PCA in high dimensions (w/ Xinyi Zhong, Chang Su)

We explore a set of related questions concerning high-dimensional variational Bayesian inference, mean-field theory and AMP algorithms for rotationally invariant models, and their applications to empirical Bayes procedures for PCA and dimensionality reduction.
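As general background (this is the textbook Gaussian-noise setting, not a result taken from the papers above): for the rank-one spiked model Y = \frac{\lambda}{n} v v^\top + W with \|v\|^2 = n and GOE noise W, an AMP iteration takes the form

x^{t+1} = Y f_t(x^t) - b_t f_{t-1}(x^{t-1}), \qquad b_t = \frac{1}{n} \sum_{i=1}^n f_t'(x_i^t),

where f_t is applied entrywise (in empirical Bayes PCA, a posterior-mean denoiser under an estimated prior), and state evolution predicts that the coordinates of x^t behave like those of \mu_t v + \sigma_t g for scalars \mu_t, \sigma_t and standard Gaussian g. In the rotationally invariant settings of the papers above, the single Onsager correction b_t f_{t-1}(x^{t-1}) is replaced by corrections involving all previous iterates, with coefficients determined by the free cumulants of the noise spectrum.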

Group orbit estimation and cryo-electron microscopy
Maximum likelihood for high-noise group orbit estimation and single-particle cryo-EM (w/ Roy Lederman, Yi Sun, Tianhao Wang, Sheng Xu)
Likelihood landscape and maximum likelihood estimation for the discrete orbit recovery model (w/ Yi Sun, Tianhao Wang, Yihong Wu)

We study estimation in the group orbit recovery model and applications to molecular structure determination via cryo-EM. A particular focus is on the high-noise regime, where our results connect properties of the Fisher information matrix and the log-likelihood optimization landscape to the structure of the invariant algebra of the rotation group.
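For context, a schematic form of the orbit recovery observation model (the notation here is illustrative):

Y_i = g_i \cdot \theta_* + \sigma \varepsilon_i, \qquad g_i \sim \mathrm{Unif}(G), \quad \varepsilon_i \sim \mathcal{N}(0, I_d), \quad i = 1, \ldots, n,

where \theta_* \in \mathbb{R}^d is the unknown signal, G is a compact group acting on \mathbb{R}^d (rotations of three-dimensional space in cryo-EM), and the group elements g_i are unobserved. Maximum likelihood estimation optimizes the marginal likelihood obtained by averaging over the group, with population objective

R(\theta) = -\,\mathbb{E} \log \int_G \exp\Big(-\tfrac{1}{2\sigma^2}\|Y - g \cdot \theta\|^2\Big)\, dg + \mathrm{const},

and the high-noise analysis expands this objective in powers of 1/\sigma^2, relating its local geometry to the G-invariant polynomial moments of \theta.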

Random matrix theory in statistics and machine learning
Spectra of the Conjugate Kernel and Neural Tangent Kernel for linear-width neural networks (w/ Zhichao Wang)
Spectral graph matching and regularized quadratic relaxations I & II (w/ Cheng Mao, Yihong Wu, Jiaming Xu)
Principal components in linear mixed models with general bulk (w/ Yi Sun, Zhichao Wang)

We apply techniques of random matrix theory and free probability theory to study spectral phenomena in various statistics and machine learning applications. Our results characterize the spectral distributions of neural network kernel matrices, develop and analyze spectral algorithms for aligning correlated random graphs, and establish BBP-type characterizations of outlier eigenvalues and eigenvectors in mixed-effects linear models.
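As one concrete point of reference, the classical BBP transition for a rank-one spiked covariance model (a standard result, stated here as background rather than a result of the papers above): if the rows of X \in \mathbb{R}^{n \times p} are i.i.d. \mathcal{N}(0, \Sigma) with \Sigma = I_p + \theta v v^\top, \|v\| = 1, and p/n \to \gamma \in (0,1), then the largest eigenvalue of \hat\Sigma = \frac{1}{n} X^\top X satisfies

\lambda_1(\hat\Sigma) \to (1+\sqrt{\gamma})^2 \;\text{ if } \theta \le \sqrt{\gamma}, \qquad \lambda_1(\hat\Sigma) \to (1+\theta)\Big(1 + \frac{\gamma}{\theta}\Big) \;\text{ if } \theta > \sqrt{\gamma},

and above the threshold the leading sample eigenvector \hat v has asymptotic alignment |\langle \hat v, v \rangle|^2 \to \frac{1 - \gamma/\theta^2}{1 + \gamma/\theta}. The mixed-model results above establish analogous outlier characterizations when the noise bulk is general rather than of this classical Marchenko-Pastur form.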

contact

Department of Statistics and Data Science
24 Hillhouse Avenue
New Haven, CT 06511
zhou.fan@yale.edu