Welcome to the Data Science Colloquium of the ENS.
This colloquium is organized around data science in a broad sense, with the goal of bringing together researchers with diverse backgrounds (including, for instance, mathematics, computer science, physics, chemistry, and neuroscience) but a common interest in dealing with large-scale or high-dimensional data.
The colloquium is followed by an open buffet, over which participants can meet and discuss potential collaborations.
These seminars are made possible by the support of the CFM-ENS Chair “Modèles et Sciences des Données”.
The list of upcoming seminars is below, followed by the list of past seminars.
Videos of some of the past seminars are available online.
The colloquium is organized by:
9 April 2026, 12h45–13h45 (Paris time), Salle de Conférence IV (24 rue Lhomond).
Michael Chertkov (University of Arizona)
Title: Samples That Cooperate, Samples That Remember: Two Exactly Solvable Bridge Diffusions
Abstract: Diffusion-based generative models treat samples as independent and memoryless. I will show that relaxing each assumption leads to rich, exactly solvable physics — with no neural networks anywhere. Giving samples a present — coupling them through their evolving mean field — produces a McKean–Vlasov optimal transport problem whose self-consistent guidance is provably the linear interpolant between endpoint means, for arbitrary distributions and any interaction schedule; applied to building-fleet demand response, this saves 20%+ in actuation energy. Giving samples a past produces a continual-learning agent whose memory is a Bridge Diffusion and whose forgetting — arising from a single lossy temporal coarse-graining step — obeys a universal linear capacity law with a Shannon-like constant. Both constructions live in the world of Riccati equations, hyperbolic functions, and mixture linear algebra; the physics of the bridge — not the expressivity of a network — controls what is achievable.
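The "linear interpolant between endpoint means" has a simple classical analogue that can be checked numerically. The sketch below (an illustration only, not the speaker's McKean–Vlasov construction) simulates a one-dimensional Brownian bridge pinned between two endpoints and verifies that its mean follows the straight line between them; the function name and parameters are invented for this example.

```python
import numpy as np

# Illustrative sketch: a Brownian bridge from x0 at t=0 to x1 at t=1,
# simulated via the pinned SDE  dX_t = (x1 - X_t)/(1 - t) dt + sigma dW_t.
# Its mean at time t is the linear interpolant (1 - t)*x0 + t*x1, echoing
# the "linear interpolant between endpoint means" claim in the abstract.

def brownian_bridge_mean(x0, x1, sigma=1.0, n_steps=1000, n_paths=5000, seed=0):
    rng = np.random.default_rng(seed)
    dt = 1.0 / n_steps
    x = np.full(n_paths, float(x0))
    means = [x.mean()]
    for k in range(n_steps - 1):
        t = k * dt
        drift = (x1 - x) / (1.0 - t)  # pinning drift toward the endpoint
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
        means.append(x.mean())
    return np.array(means)

means = brownian_bridge_mean(x0=-1.0, x1=2.0)
ts = np.linspace(0.0, 1.0, len(means), endpoint=False)
# Empirical mean stays close to the straight line (1 - t)*(-1) + t*2
deviation = np.max(np.abs(means - ((1 - ts) * -1.0 + ts * 2.0)))
print(deviation)
```

The bridge here is driven by independent noise; the talk's point is what changes when the samples instead interact through their evolving mean field or carry memory.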