Discrete Diffusion Reading Group

137 posts


@diffusion_llms

📚 Journal club on discrete diffusion models
🎥 Replays available on YouTube!
Contact: [email protected]
Hosted by @ssahoo_, @jdeschena, @zhihanyang_

Joined August 2025
0 Following · 2K Followers
Pinned Tweet
Discrete Diffusion Reading Group @diffusion_llms ·
Drowning in the sea of Discrete Diffusion papers? 🌊 We got you. Join our Reading Group!
From theory → empirics, and language → molecules — we’ll decode the chaos together 💫
Join the cult—uh, I mean community 😇
👉 Google Group: groups.google.com/g/diffusion-ll… (1 / 2)
Discrete Diffusion Reading Group @diffusion_llms ·
📢 May 18 (Mon): IDLM: Inverse-distilled Diffusion Language Models

🤔 Diffusion Language Models (DLMs) have recently achieved strong results in text generation. However, their multi-step sampling leads to slow inference, limiting practical use.
💡 To address this, the authors extend Inverse Distillation, a technique originally developed to accelerate continuous diffusion models, to the discrete setting. However, this extension introduces both theoretical and practical challenges.
🔧 To overcome these challenges, the authors first provide a theoretical result demonstrating that their inverse formulation admits a unique solution, thereby ensuring valid optimization. They then introduce gradient-stable relaxations to support effective training.
📊 As a result, experiments on multiple DLMs show that their method, Inverse-distilled Diffusion Language Models (IDLM), reduces the number of inference steps by 4×–64× while preserving the teacher model’s entropy and generative perplexity.

This Monday, David Li (scholar.google.com/citations?user…) and Nikita Gushchin (scholar.google.com/citations?user…) will present their jointly led paper, which was recently accepted at ICML 2026.

Collaborators of this work include: Dmitry Abulkhanov (@dabulkhanov_), Eric Moulines (scholar.google.com/citations?user…), Ivan Oseledets (@oseledetsivan), Maxim Panov (@maxim_panov), Alexander Korotin (akorotin.netlify.app)

Paper link: arxiv.org/abs/2602.19066
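The speedup that step-distillation targets can be pictured with a toy few-step masked-diffusion sampler. This is an illustrative sketch only, not IDLM itself: `toy_denoiser`, the mask token, and the tiny vocabulary are all made up here. The point is that each sampling step costs one denoiser (model) call, so cutting the step count directly cuts inference cost.

```python
import random

MASK = "<mask>"
VOCAB = ["a", "b", "c", "d"]

def toy_denoiser(tokens):
    # Stand-in for a trained DLM: fills every masked position with a token.
    # A real model would predict a distribution per position.
    return [random.choice(VOCAB) if t == MASK else t for t in tokens]

def sample(seq_len, num_steps, seed=0):
    # Masked-diffusion-style sampling: start fully masked, then reveal
    # roughly seq_len / num_steps positions per step. Fewer steps means
    # fewer denoiser calls, which is the speedup distillation is after.
    random.seed(seed)
    tokens = [MASK] * seq_len
    reveal_order = list(range(seq_len))
    random.shuffle(reveal_order)
    per_step = -(-seq_len // num_steps)  # ceiling division
    for step in range(num_steps):
        preds = toy_denoiser(tokens)
        for pos in reveal_order[step * per_step:(step + 1) * per_step]:
            tokens[pos] = preds[pos]
    return tokens

print(sample(seq_len=16, num_steps=16))  # many-step regime: 16 denoiser calls
print(sample(seq_len=16, num_steps=4))   # few-step regime: 4 denoiser calls
```

Both calls produce a fully unmasked sequence; the second does it with 4× fewer model evaluations, which is the quantity the paper's 4×–64× reduction refers to.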
Discrete Diffusion Reading Group retweeted
Zhihan Yang @zhihanyang_ ·
Join our reading group next Monday! Paper: IDLM: Inverse-distilled Diffusion Language Models Presenters: David Li, Nikita Gushchin
Discrete Diffusion Reading Group @diffusion_llms ·
📢 May 11 (Mon): Unifying Masked Diffusion Models with Various Generation Orders and Beyond

🤔 AR generates left-to-right; masked diffusion generates in any order; and block diffusion generates block-wise left-to-right, with random order within each block. Can we unify all these frameworks and further learn the generation order jointly with token prediction?
💡 The authors propose OeMDM, a unified masked diffusion framework that can express various generation orders, and LoMDM, which jointly learns the generation order and the diffusion model.
🔍 Everything comes down to the scheduler: by making the forward and reverse schedulers maximally flexible, it becomes possible to describe all generation orders, even learnable ones, within the masked diffusion framework.
📈 LoMDM achieves SOTA among discrete diffusion models across all benchmarks, and even outperforms block diffusion models, which strongly benefit from left-to-right bias!

This Monday, Chunsan Hong (@ChunsanHong) will present his paper, which received a Spotlight at ICML 2026.

Collaborators of this work include: Sanghyun Lee, Jong Chul Ye (bispl.weebly.com/professor.html)

Paper link: arxiv.org/abs/2602.02112
Discrete Diffusion Reading Group retweeted
Subham Sahoo @ssahoo_ ·
Giving a talk on Eso-LMs at the MM Intelligence Workshop
⏲️ Saturday, 10:45 - 11 am
📍 204 C
Summary: MDLMs with Causal Attention support:
> Exact KV caching (unlike block diffusion)
> Single-pass NELBO estimation => improved RL post-training
> Exact Likelihood Computation!!!
Subham Sahoo @ssahoo_

Esoteric Language Models
🔥 Beats MDLM on the speed-quality Pareto frontier
🔥 Exact KV Caching
🔥 Exact Likelihood Computation
🔖 arxiv.org/abs/2506.01928
🖥️ s-sahoo.com/Eso-LMs/
x.com/ssahoo_/status…

Discrete Diffusion Reading Group retweeted
Subham Sahoo @ssahoo_ ·
✈️ Discrete Diffusion Meetup @iclr_conf
📅 Today, April 24 (Fri) | 4PM
We meet at this location in the middle of the conference center and talk shop!
Subham Sahoo @ssahoo_

✈️ Discrete Diffusion Meetup @iclr_conf
📅 RioCentro | April 24 (Thurs) | 4PM
I’ll share the exact location in the comments as we get closer. Save this post so you don’t miss the update.

Discrete Diffusion Reading Group @diffusion_llms ·
Let's meet in the garden in the middle of the conference center, near the white structure and under the trees 🚀
Discrete Diffusion Reading Group retweeted
Justin Deschenaux @jdeschena ·
🇧🇷 I'll be in Rio de Janeiro for ICLR from tomorrow, where I will present 4 of our recent works on diffusion language models, including PGM (oral) and BlockGen (workshop spotlight talk). I'll be happy to meet and catch up in person, please reach out if you'd like to chat :)
Discrete Diffusion Reading Group retweeted
Subham Sahoo @ssahoo_ ·
Attending @iclr_conf to present the following papers -- feel free to reach out if you’d like to schedule a 1:1.
> The Diffusion Duality, Chapter 2: Psi Samplers
📅 Friday, Apr 24 🕦 10:30 am - 1 pm 📍 Pavilion 3 P3-#723
> Esoteric Language Models
Oral at the MM Intelligence workshop.
📅 Sunday, Apr 26 🕦 10:45 - 11:00 am 📍 Room 204 C