Sam Buchanan
Postdoctoral Scholar
University of California, Berkeley
[email protected]
I am a postdoctoral scholar at the University of California, Berkeley. Previously, I was at TTIC, after finishing my PhD in Electrical Engineering at Columbia University.
My research focuses on efficiency in large model training. I work on optimization and model architecture improvements that make better use of data and compute as we scale up, spanning implementation, algorithm design, and mathematical foundations.
Recent Highlights
- Book Release: We have released version 1.0 of our new fully open-source book on representation learning with deep networks (link). It covers theoretical foundations from information theory to optimization, as well as concrete applications such as transformers and contrastive learning. The book is full of new perspectives from our recent research: check out Chapters 3 and 6 for our take on diffusion, endorsed by the great Kevin Murphy!
- Accepted Papers: Two papers accepted to NeurIPS 2025! One gives a theoretical analysis of memorization and generalization in diffusion models (link), and one builds diffusion models with proximal operators, leading to fewer NFEs at sampling time (link). (Sep 2025)
Upcoming Events
- 3rd Conference on Parsimony and Learning: I am co-organizing the third Conference on Parsimony and Learning (CPAL). This year, the conference will be held in Tübingen, Germany. Please consider attending!