Previously, I was a Motwani Postdoctoral Fellow in the Department of Computer Science at Stanford University. Before that, I was a research fellow at UC Berkeley's Simons Institute in the program on Computational Complexity of Statistical Inference. I received my PhD in Mathematics and Statistics from MIT, where I was co-advised by Ankur Moitra and Elchanan Mossel, and before that I received my undergraduate degree in Mathematics from Princeton University.
I am interested in machine learning, theoretical computer science, and high-dimensional probability and statistics. Some recent topics of interest include generalization theory, algorithms for learning and inference in graphical models, sampling algorithms and generative modeling, and related aspects of statistical physics.
Please feel free to reach out, though unfortunately I may not be able to reply to every email.
Email: [first initial][last name]@uchicago.edu
Office: Room 303, 5460 S. University Avenue
Constructive Approximation under Carleman's Condition, with Applications to Smoothed Analysis, joint with Beining Wu. Symposium on Theory of Computing (STOC) 2026, to appear. [arXiv:2512.04371]
Efficiently learning and sampling multimodal distributions with data-based initialization, joint with Holden Lee and Thuy-Duong Vuong. Conference on Learning Theory (COLT) 2025. [arXiv:2411.09117]
Lasso with Latents: Efficient Estimation, Covariate Rescaling, and Computational-Statistical Gaps, joint with Jon Kelner, Raghu Meka, and Dhruv Rohatgi. Conference on Learning Theory (COLT) 2024. [arXiv:2402.15409]
Sampling Multimodal Distributions with the Vanilla Score: Benefits of Data-Based Initialization, joint with Thuy-Duong Vuong. International Conference on Learning Representations (ICLR) 2024. [arXiv:2310.01762]
Feature Adaptation for Sparse Linear Regression, joint with Jon Kelner, Raghu Meka, and Dhruv Rohatgi. Advances in Neural Information Processing Systems (NeurIPS) 2023 (Spotlight Presentation). [arXiv:2305.16892]
Statistical Efficiency of Score Matching: The View from Isoperimetry, joint with Alexander Heckett and Andrej Risteski. Abridged version in NeurIPS 2022 Workshop on Score-Based Methods (Oral Presentation). International Conference on Learning Representations (ICLR) 2023 (Oral Equivalent, Top 5% of Papers). [arXiv:2210.00726][slides]
Distributional Hardness Against Preconditioned Lasso via Erasure-Robust Designs, joint with Jon Kelner, Raghu Meka, and Dhruv Rohatgi. Advances in Neural Information Processing Systems (NeurIPS) 2022. [arXiv:2203.02823]
Reconstruction on Trees and Low-Degree Polynomials, joint with Elchanan Mossel. Advances in Neural Information Processing Systems (NeurIPS) 2022 (Oral Presentation). [arXiv:2109.06915]
Variational autoencoders in the presence of low-dimensional data: landscape and implicit bias, joint with Viraj Mehta, Andrej Risteski, and Chenghui Zhou. International Conference on Learning Representations (ICLR) 2022. [arXiv:2112.06868]
Entropic Independence II: Optimal Sampling and Concentration via Restricted Modified Log-Sobolev Inequalities, joint with Nima Anari, Vishesh Jain, Huy Tuan Pham, and Thuy-Duong Vuong. Symposium on Theory of Computing (STOC) 2022 (extended abstract merged w/ Ent. Ind. I). [arXiv:2111.03247].
Representational aspects of depth and conditioning in normalizing flows, joint with Viraj Mehta and Andrej Risteski. International Conference on Machine Learning (ICML) 2021. [arXiv:2010.01155]
From Boltzmann Machines to Neural Networks and Back Again, joint with Surbhi Goel and Adam Klivans. Advances in Neural Information Processing Systems (NeurIPS) 2020. [arXiv:2007.12815]
Classification Under Misspecification: Halfspaces, Generalized Linear Models, and Connections to Evolvability, joint with Sitan Chen, Ankur Moitra, and Morris Yau. Advances in Neural Information Processing Systems (NeurIPS) 2020 (Spotlight Presentation). [arXiv:2006.04787][Video]
Learning Some Popular Gaussian Graphical Models without Condition Number Bounds, joint with Jonathan Kelner, Raghu Meka, and Ankur Moitra. Advances in Neural Information Processing Systems (NeurIPS) 2020 (Spotlight Presentation). [arXiv:1905.01282]. [Video]
Fast Convergence of Belief Propagation to Global Optima: Beyond Correlation Decay. Advances in Neural Information Processing Systems (NeurIPS) 2019 (Spotlight Presentation). [arXiv:1905.09992]
How Many Subpopulations is Too Many? Exponential Lower Bounds for Inferring Population Histories, joint with Younhun Kim, Ankur Moitra, Elchanan Mossel, and Govind Ramnarayan. International Conference on Research in Computational Molecular Biology (RECOMB) 2019; Journal of Computational Biology (JCB) Special Issue. [arXiv:1811.03177]
Mean-field approximation, convex hierarchies, and the optimality of correlation rounding: a unified perspective, joint with Vishesh Jain and Andrej Risteski. Symposium on Theory of Computing (STOC) 2019. [arXiv:1808.07226]
The Comparative Power of ReLU Networks and Polynomial Kernels in the Presence of Sparse Latent Structure, joint with Andrej Risteski. International Conference on Learning Representations (ICLR) 2019. [arXiv:1805.11405] [OpenReview]
Information theoretic properties of Markov random fields, and their algorithmic applications, joint with Linus Hamilton and Ankur Moitra. Advances in Neural Information Processing Systems (NeurIPS) 2017. [arXiv:1705.11107]
Busy Time Scheduling on a Bounded Number of Machines, joint with Samir Khuller. Algorithms and Data Structures Symposium (WADS) 2017. (Full version, slides)
Optimal batch schedules for parallel machines, joint with Samir Khuller. Algorithms and Data Structures Symposium (WADS) 2013. (Full version)
Generalization theory and approximation
Learning algorithms
Variational inference, sampling, and generative modeling
Scheduling algorithms
In most cases, authors are listed in alphabetical order, following the convention in mathematics and theoretical computer science.
Miscellaneous service: PC member for SODA 2025, STOC 2024, COLT 2021, 2022, and 2024, and ALT 2023; Area Chair for NeurIPS 2023 and 2024. Refereeing for many journals and conferences (further information available upon request).