

Stephan Mandt
@StephanMandt
AI Professor @UCIrvine | Associate Director, AI in Science Institute | Formerly @blei_lab, @Princeton | Chair @aistats_conf 2025 | AI Resident @ChanZuckerberg


We've known that diffusion models are theoretically very good lossy data compressors, but how can we actually implement this idea in practice? I discuss this and related topics in a new review article on diffusion-based generative compression arxiv.org/abs/2601.18932

How to model event sequences with real-world variety: mixed data types, different lengths, …? Meet FlexTPP, a unified transformer framework with discrete & continuous heads for healthcare, complex annotations, and more! NeurIPS spotlight, Fri 11am #2102! openreview.net/forum?id=MtwsR…
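To make the "discrete & continuous heads" idea concrete, here is a minimal numpy sketch of one prediction step for a marked temporal point process: a shared history encoding feeds a categorical head for the event type and a parametric head for the inter-event time. All names, the random stand-in encoder, and the log-normal time distribution are my illustrative assumptions, not FlexTPP's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
D, K = 16, 4  # hidden size and number of discrete event types (toy choices)

# Stand-in for a transformer's encoding of the event history.
h = rng.normal(size=D)

# Discrete head: logits over event types -> numerically stable log-softmax.
W_type = rng.normal(size=(K, D)) / np.sqrt(D)
logits = W_type @ h
m = logits.max()
log_probs = logits - m - np.log(np.exp(logits - m).sum())

# Continuous head: parameters of a log-normal over the inter-event time.
w_mu = rng.normal(size=D) / np.sqrt(D)
w_logsig = rng.normal(size=D) / np.sqrt(D)
mu, sigma = w_mu @ h, np.exp(w_logsig @ h)

def nll(event_type, dt):
    """Joint negative log-likelihood of (event type, inter-event time dt > 0)."""
    ll_type = log_probs[event_type]
    z = (np.log(dt) - mu) / sigma
    ll_time = -0.5 * z ** 2 - np.log(dt * sigma * np.sqrt(2 * np.pi))
    return -(ll_type + ll_time)

print(nll(0, 0.5))  # a finite scalar; training would minimize this over sequences
```

Because both heads read the same encoding, one network can handle sequences that mix categorical marks with continuous timing, which is the flavor of unification the post describes.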



Made a pretty website for our ICLR 2025 work AstroCompress: neural compression for space telescopes + 320 GB of ML-ready astro image data. rithwiksud.github.io/astrocompress-… Has links to paper, data, code, Jupyter notebook, reviews, & ICLR video presentation.


Exciting news! Our paper "On the Challenges and Opportunities in Generative AI" has been accepted to TMLR 2025. 📄 arxiv.org/abs/2403.00025



In this #CVPR2025 edition of our community-building workshop series, we focus on supporting the growth of early-career researchers. Join us tomorrow (Jun 11) at 12:45 PM in Room 209. Schedule: sites.google.com/view/standoutc… We have an exciting lineup of invited talks and candid panels: @sarameghanbeery, @dimadamen, @jbhuang0604, @lealtaixe, @LerrelPinto, @lschmidt3, @shubhtuls, @gulvarol, @cvondrick, @sainingxie. Co-organizing with @unnatjain2010, @ap229997, @georgiagkioxari, @akanazawa, and Lana Lazebnik @CVPR

ICML 25 paper on variational guidance for diffusion models accepted

Happy to share that our diffusion model guidance paper with @farrinsofian, @kpandey008, @felixDrRelax, and @StephanMandt, which casts control for guidance as variational inference with auxiliary variables, was accepted at ICML 25. This is a collaboration within our AI residency at @cziscience with the @UCIrvine team around Stephan Mandt, and it is exciting for a number of reasons:

• We believe that unconditional diffusion models will be an important foundational building block of virtual cells (stay tuned for more), and guidance is the mechanism to steer them toward acting as virtual instruments that run experiments. Improving such guidance mechanisms is a key capability we care about.

• Optimizing our auxiliary latent variables via variational inference has the flavor of a structured form of test-time compute, with guaranteed improvements (tighter lower bounds!) as we invest more computation. Test-time compute is huge in reasoning models, and it will increasingly become an obvious component of diffusion models too, where we analogize "posterior inference = search" and work through ways to simulate objects with our generative models more accurately and faithfully.

• Along the way, the paper demonstrates significant gains on common guidance benchmarks, as Farrin shows in her tweetorial.

Very excited about this collaboration and about bringing together Stephan's and my passion for approximate inference with rigorous technical work on diffusion models, in service of our goal to build diffusion models for virtual cells and #AI4Science. Congratulations to the amazing team!

Paper: arxiv.org/abs/2502.03686
Code: github.com/czi-ai/oc-guid…
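To illustrate what "guidance" means mechanically, here is a minimal numpy sketch of guided sampling on a toy 1D problem: an unconditional "model" is the exact score of a two-mode Gaussian mixture, and a guidance gradient (the gradient of a hypothetical log p(y|x)) steers annealed Langevin sampling toward one mode. This is generic gradient guidance for intuition only, not the paper's variational method, and all names and constants are my assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def score_uncond(x, s2):
    """Score of the noised marginal of a two-mode Gaussian mixture (modes at ±2)."""
    v = 1.0 + s2  # unit-variance modes convolved with N(0, s2) noise
    w1 = np.exp(-(x + 2.0) ** 2 / (2 * v))
    w2 = np.exp(-(x - 2.0) ** 2 / (2 * v))
    return (w1 * (-(x + 2.0)) + w2 * (-(x - 2.0))) / (v * (w1 + w2))

def guidance_grad(x, scale=2.0):
    """Gradient of a hypothetical log p(y|x) pulling samples toward the +2 mode."""
    return scale * (-(x - 2.0))

# Annealed Langevin sampling: unconditional score plus guidance gradient.
x = rng.normal(0.0, 3.0, size=2000)
for s in np.geomspace(3.0, 0.1, 25):
    eps = 0.05 * s ** 2  # step size shrinks with the noise level
    for _ in range(20):
        drift = score_uncond(x, s ** 2) + guidance_grad(x)
        x = x + eps * drift + np.sqrt(2 * eps) * rng.normal(size=x.shape)

print(x.mean())  # samples concentrate near the guided mode at +2
```

Without `guidance_grad`, the sampler would split its mass across both modes; adding the conditional gradient is what turns the unconditional model into a steerable "instrument," and the paper's contribution is a principled, variational way to do that steering with auxiliary variables.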

🚀 News! Our recent #ICML2025 paper “Variational Control for Guidance in Diffusion Models” introduces a simple yet powerful method for guidance in diffusion models — and it doesn’t need model retraining or extra networks. 📄 Paper: arxiv.org/abs/2502.03686 💻 Code: github.com/czi-ai/oc-guid… A short interactive thread 🧵👇
