Stephan Mandt

505 posts


@StephanMandt

AI Professor @UCIrvine | Associate Director, AI in Science Institute | Formerly @blei_lab, @Princeton | Chair @aistats_conf 2025 | AI Resident @ChanZuckerberg

Irvine, California · Joined March 2015
604 Following · 2.9K Followers
Stephan Mandt retweeted
Symposium on Probabilistic Machine Learning
ProbML 2026 (formerly AABI) invites submissions on probabilistic ML (Bayesian & beyond), July 5 in Seoul (co-located with ICML). Website: probml.cc. Tracks: proceedings (PMLR), workshop, fast track. New focus includes healthcare & climate! Submit by: 20 March 2026
Stephan Mandt@StephanMandt·
Excited to contribute to a growing scientific ecosystem in SoCal through our new AI in Science Institute at UCI. Scientific AI raises long-term questions—central to our inaugural symposium, from agentic co-scientists to weather to biology. Join us next year—sun included ☀️
Stephan Mandt@StephanMandt·
Congrats to @FelixDrRelax, Yang Meng, and Lukas Laskowski on a NeurIPS Spotlight! 🎉 A simple idea made practical, demonstrated on event sequences, for efficiently modeling mixed discrete-continuous data with transformers.
Stephan Mandt@StephanMandt·
Recently gave a LEAP lecture at @Columbia and at @UCLA on a question I’m excited about: How can we design diffusion models for scientific inference—uncertainty-aware, calibrated, steerable, and heavy-tailed? youtube.com/watch?v=QeLZI4…
Stephan Mandt@StephanMandt·
Amid all the review frustration, a big shoutout to all reviewers and area chairs. Peer feedback is a crucial step in developing papers, and it takes serious time and effort. As authors, let’s appreciate the process!
Stephan Mandt@StephanMandt·
When a single telescope is projected to stream ~62 exabytes of data every year, we need better compression. Learned compression is one answer; check out our new project page here:
Rithwik Sudharsan@aRithmetix

Made a pretty website for our ICLR 2025 work AstroCompress: neural compression for space telescopes + 320 GB of ML-ready astro image data. rithwiksud.github.io/astrocompress-… Has links to paper, data, code, Jupyter notebook, reviews, & ICLR video presentation.

Stephan Mandt@StephanMandt·
Huge thanks to Laura Manduchi, Clara Meister & Kushagra Pandey, who led the 2-year effort of writing “On the Challenges and Opportunities in Generative AI” involving 27 authors. Coming out of a 2023 Dagstuhl Seminar I co-organized with @vincefort, @liyzhen2 & @sirbayes.
Clara Isabel Meister@clara__meister

Exciting news! Our paper "On the Challenges and Opportunities in Generative AI" has been accepted to TMLR 2025. 📄 arxiv.org/abs/2403.00025

Stephan Mandt retweeted
Unnat Jain@unnatjain2010·
✨New edition of our community-building workshop series!✨ Tomorrow at @CVPR, we invite speakers to share their stories, values, and approaches for navigating a crowded and evolving field, especially for early-career researchers. Cheeky title🤭: How to Stand Out in the Crowd🙋? Details & context here: sites.google.com/view/standoutcv
Anand Bhattad@anand_bhattad

In this #CVPR2025 edition of our community-building workshop series, we focus on supporting the growth of early-career researchers. Join us tomorrow (Jun 11) at 12:45 PM in Room 209.
Schedule: sites.google.com/view/standoutc…
We have an exciting lineup of invited talks and candid panels: @sarameghanbeery, @dimadamen, @jbhuang0604, @lealtaixe, @LerrelPinto, @lschmidt3, @shubhtuls, @gulvarol, @cvondrick, @sainingxie
Co-organizing with @unnatjain2010, @ap229997, @georgiagkioxari, @akanazawa, and Lana Lazebnik @CVPR

Stephan Mandt@StephanMandt·
Thanks for the great collaboration!
Theofanis Karaletsos@Tkaraletsos

ICML 25 paper on variational guidance for diffusion models accepted

Happy to share that our diffusion model guidance paper with @farrinsofian, @kpandey008, @felixDrRelax, and @StephanMandt on casting control for guidance as variational inference with auxiliary variables was accepted at ICML 25. This is a collaboration within our AI residency at @cziscience with the @UCIrvine team around Stephan Mandt, and it is very exciting for a number of reasons:

• We believe that unconditional diffusion models will be an important foundational building block of virtual cells (stay put for more), and guidance is the mechanism to steer them towards acting as virtual instruments running experiments. Improving such guidance mechanisms is a key capability we care about.

• Optimizing our auxiliary latent variables via variational inference has the flavor of a structured form of test-time compute, with guaranteed improvements (tighter lower bounds!) as we invest more computation. Test-time compute is huge in reasoning models; it will also increasingly be an obvious component in diffusion models, where we analogize that “posterior inference = search” and work through ways to simulate objects using our generative models with more accuracy and faithfulness.

• Along the way, the paper demonstrates significant gains on common benchmarking tasks for guidance systems, as Farrin shows in her tweetorial.

Very excited about the collaboration and being able to bring together Stephan’s and my passion for approximate inference with rigorous technical work on diffusion models, in service of our goals to build diffusion models for virtual cells and #AI4Science. Congratulations to the amazing team!

Paper: arxiv.org/abs/2502.03686
Code: github.com/czi-ai/oc-guid…
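The “guidance = variational optimal control” framing can be illustrated with a toy sketch. The snippet below is illustrative only and is not the paper’s algorithm: a deterministic linear contraction stands in for the unconditional reverse process, and per-step control variables are optimized against a terminal cost toward a target plus a quadratic control penalty — the standard control-as-inference trade-off. All constants (`T`, `a`, `lam`, `y`, `x_T`) are made up for the demo.

```python
import numpy as np

# Toy sketch (NOT the paper's algorithm): guidance viewed as optimal control.
# Unconditional "reverse process": x_{t-1} = a * x_t (deterministic, linear).
# We add per-step controls u_t and minimize a terminal cost toward target y
# plus a quadratic control penalty.

T, a, lam = 10, 0.9, 0.01   # steps, contraction factor, control penalty
y = 3.0                     # hypothetical conditioning target
x_T = 0.5                   # fixed "noise" starting point, for determinism

def rollout(u):
    """Run the controlled reverse process; return the endpoint x_0."""
    x = x_T
    for t in range(T):
        x = a * x + u[t]
    return x

# Sensitivity of x_0 to each control: d x_0 / d u_t = a^(T - 1 - t),
# since u_t is contracted by a for every remaining step.
sens = np.array([a ** (T - 1 - t) for t in range(T)])

# Optimize the controls by gradient descent on the quadratic objective
# (x_0 - y)^2 + lam * ||u||^2, using its closed-form gradient.
u = np.zeros(T)
for _ in range(500):
    x0 = rollout(u)
    grad = 2.0 * (x0 - y) * sens + 2.0 * lam * u
    u = u - 0.05 * grad

print("uncontrolled:", rollout(np.zeros(T)), "controlled:", rollout(u))
```

With `lam` small, the optimized endpoint lands near the target while the uncontrolled rollout decays toward zero; spending more optimization (a crude stand-in for test-time compute) only tightens the objective, loosely mirroring the tighter-bounds-with-more-compute point in the tweet.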

Stephan Mandt@StephanMandt·
TL;DR: Guidance = variational optimal control. I'm excited to share the outcomes of this collaboration with @Tkaraletsos at the @ChanZuckerberg_ Initiative. All credit to my amazing students @farrinsofian and @kpandey008!
Farrin Marouf Sofian@farrinsofian

🚀 News! Our recent #ICML2025 paper “Variational Control for Guidance in Diffusion Models” introduces a simple yet powerful method for guidance in diffusion models — and it doesn’t need model retraining or extra networks. 📄 Paper: arxiv.org/abs/2502.03686 💻 Code: github.com/czi-ai/oc-guid… A short interactive thread 🧵👇

Stephan Mandt retweeted
yingzhen@liyzhen2·
#AISTATS2025 day 3 keynote by Akshay Krishnamurthy about how to do theory research on inference time compute 👍 @aistats_conf
Stephan Mandt@StephanMandt·
Back from @aistats_conf in Thailand to my Zurich sabbatical—sipping coffee in the same spots Einstein once did. What a journey! Huge thanks to my entire AISTATS team: reviewers, ACs, senior ACs, and Chairs. It’s been amazing to work with you!
Stephan Mandt retweeted
AISTATS Conference@aistats_conf·
And last but not least... the Best Student Paper Award at #AISTATS 2025 goes to Daniel Marks and Dario Paccagnan for "Pick-to-Learn and Self-Certified Gaussian Process Approximations". Congratulations!
Stephan Mandt retweeted
AISTATS Conference@aistats_conf·
The #AISTATS 2025 Test of Time Award goes to ... 🥁 ... Chen-Yu Lee, Saining Xie, Patrick Gallagher, Zhengyou Zhang, Zhuowen Tu, for "Deeply Supervised Nets"! Congratulations!