We are pleased to announce the second edition of GRaM as an ICLR 2026 workshop. This year's focus is scale and simplicity. We open three tracks: a papers track, a blog post track, and a new competition track.
Deadlines
- Paper submission: February 5, 2026 (AoE)
- Blogpost submission: February 26, 2026 (AoE)
- Paper notification: March 1, 2026 (AoE)
- Paper and blogpost camera-ready: March 11, 2026 (AoE)
- Workshop dates: April 25-26, 2026 (exact date to be announced)
Call for Papers (Submit via OpenReview)
We welcome submissions to two complementary tracks. Page limits exclude references and appendices:
- Proceedings track: Full papers (up to 8 pages) presenting complete research results. Accepted papers will be published in the workshop proceedings at PMLR.
- Tiny paper track: Short papers (up to 4 pages) intended for early ideas, ongoing work, new perspectives, or contributions that benefit from a lighter review and discussion-oriented format.
Papers should be submitted on the dedicated OpenReview submission page, ICLR 2026 Workshop GRaM. Submissions require an OpenReview account (institutional email recommended).
Template files: To prepare your submission, please use the following template download link.
Call for Blog Posts (Submit a blog post)
As in previous editions, we also invite submissions to our blog post track. We welcome blog posts in the ICLR Distill format that aim to communicate ideas clearly and accessibly. Submission guidelines are available on the website: Call for Blog Posts. You can also explore blog posts from GRaM 2024 for examples.
Important information about the Tiny Papers
Since 2025, ICLR has discontinued the separate "Tiny Papers" track and instead requires each workshop to accept short paper submissions (3–5 pages in ICLR format; the exact page limit is determined by each workshop), with an eye towards inclusion; see the ICLR 2025 Call for Tiny Papers for a history of the initiative. Authors of these papers will be earmarked for potential funding from ICLR, but they must submit a separate Financial Assistance application that evaluates their eligibility. The Financial Assistance application to attend ICLR 2026 will open at the beginning of February and close in early March.
Motivation
Many real-world datasets have geometric structure, but most ML methods ignore it and treat all inputs as plain vectors. GRaM is a workshop about grounding models in geometry, using ideas ranging from group equivariance to non-Euclidean metrics, to build better, more interpretable representations and generative models.
An approach is geometrically grounded if it respects the geometric structure of the problem domain and supports geometric reasoning.
For this second edition, we aim to explore the relevance of geometric methods, particularly in the context of large models, focusing on the theme of scale and simplicity.
Topics
We solicit submissions that present theoretical research, methodologies, applications, insightful analyses, and even open problems, within the following topics (list not exhaustive):
- Preserving data geometry
- Preservation of symmetries; e.g., through equivariant operators.
- Geometric representation systems; e.g., encoding data in intrinsically structured forms via Clifford algebras or steerable vectors with Clebsch-Gordan products.
- Isometric latent mappings; e.g., learning latent representations of the data via pullback metrics.
- Inducing geometric structure
- Geometric priors; e.g., introducing curvature, symmetry, or topological constraints through explicit regularization.
- Non-Euclidean generative models; e.g., extending diffusion models or flow matching models to non-Euclidean domains with a predefined metric.
- Metric-preserving embeddings; e.g., learning latent spaces where intrinsic geodesic distances are mapped to Euclidean ones.
- Geometry in theoretical analysis
- Data and latent geometry · Gaining insights on the data manifold, statistical manifold or the latent variables using geometric tools.
- Loss landscape geometry · Viewing parameters and their optimization trajectory as lying on a manifold, enabling analysis of curvature, critical points, and generalization.
- Theoretical frameworks · Using differential geometry, algebraic geometry, or group theory to provide a generalizing perspective on generation or representation learning.
- Open problems · Identifying and addressing unresolved questions and challenges that lie at the intersection of geometry and learning.
- Scale and Simplicity
- Geometry at scale · Does equivariance retain value in large-scale models?
- Redundancy and minimality · Evaluating when geometric structure is essential versus when simpler architectures suffice.
- Challenging assumptions · Reporting negative results or limitations of geometric methods to guide future development.
Organizers
Alison Pouplin
Sharvaree Vadgama
Erik Bekkers
Sékou-Oumar Kaba
Hannah Lawrence
Manuel Lecha
Elizabeth (Libby) Baker
Julian Suk
Robin Walters
Jakub Tomczak
Stefanie Jegelka