Ye He

8 posts

@YeHeMath

Hale Visiting Assistant Professor at Georgia Institute of Technology #sampling #probability #statistics #machinelearning

Atlanta · Joined July 2023
39 Following · 65 Followers
Xueyan Zou @xyz2maureen
I will join Tsinghua University, College of AI, as an Assistant Professor in the coming month. I am actively looking for 2026 spring interns and future PhDs (ping me if you are at #NeurIPS).

It has been an incredible journey of 10 years since I attended an activity organized by Tsinghua University and, inspired by one of my teammates, decided to change my undergraduate major from Economics to Computer Science. Over those 10 years I met, with great appreciation, many wonderful researchers and professors who guided my continued growth. 🐿️

My research focus will continue to be AI & Robotics, with a specific emphasis on Interactive Embodied Intelligence. You can check my homepage to learn more: maureenzou.github.io/lab.html

I am currently local to San Diego and will be attending #NeurIPS. Please ping me via WeChat or email if any old or new friends are interested in having a coffee chat! (Really looking forward to meeting as many friends as possible at #NeurIPS.) [The photo is one of the places that I will miss a lot in the US]
Ye He retweeted
Molei Tao @MoleiTaoMath
Nonconvex optimization can be hard. Sampling, as a stochastic generalization, is not always easier. What about a case further complicated by nonconvex inequality and equality constraints? arxiv.org/abs/2510.22044 (#NeurIPS2025) introduces a new tool, and samples exponentially fast!
Ye He @YeHeMath
Summary: This work offers a statistical perspective on CFG in discrete diffusion:
• Explicit 1D/2D formulas
• Discussion of the sampled distributions in terms of supports and local covariances
• Total variation bounds on convergence rates
Glad to discuss with anyone interested!
Ye He @YeHeMath
New preprint: “What Exactly Does Guidance Do in Masked Discrete Diffusion Models” arxiv.org/abs/2506.10971, joint work with @KevRojas1499 @MoleiTaoMath. We provide a rigorous theoretical analysis of the sampling dynamics in masked discrete diffusion with classifier-free guidance.
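For readers unfamiliar with classifier-free guidance in the discrete setting, a minimal sketch of the standard CFG combination on per-token categorical logits may help. The function names, logit values, and vocabulary size below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def cfg_logits(logits_cond, logits_uncond, w):
    """Standard classifier-free guidance on categorical logits:
    l_guided = (1 + w) * l_cond - w * l_uncond.
    (Illustrative convention; the paper's exact parameterization may differ.)"""
    return (1.0 + w) * np.asarray(logits_cond) - w * np.asarray(logits_uncond)

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Toy vocabulary of 4 tokens at one masked position (values invented).
l_cond = np.array([2.0, 0.5, 0.1, -1.0])  # conditional denoiser logits
l_unc  = np.array([1.0, 1.0, 0.5,  0.0])  # unconditional denoiser logits

for w in (0.0, 1.0, 4.0):
    print(w, np.round(softmax(cfg_logits(l_cond, l_unc, w)), 3))
```

Increasing `w` pushes mass toward tokens where the conditional model most exceeds the unconditional one, which is the "sharpening" effect the thread below analyzes.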
Ye He @YeHeMath
TV distance to the generated distribution decays double-exponentially in guidance strength.
• Stronger guidance = faster convergence
• But also more aggressive shifts = less numerical stability
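A toy numerical sketch of the qualitative effect: as guidance strength w grows, the w-tilted distribution concentrates on the best-fit token, so its TV distance to that limiting point mass shrinks rapidly. All numbers below are invented for illustration, and this toy only shows the sharpening trend, not the paper's double-exponential rate:

```python
import numpy as np

def tv(p, q):
    # Total variation distance between two discrete distributions.
    return 0.5 * np.abs(p - q).sum()

# Toy base distribution over 4 tokens and classifier likelihoods p(y|x).
p_base = np.array([0.4, 0.3, 0.2, 0.1])
lik    = np.array([0.9, 0.5, 0.2, 0.1])  # p(y | x) for the target label y

def tilted(w):
    # w-tilted distribution: p(x) * p(y|x)^w, renormalized.
    unnorm = p_base * lik ** w
    return unnorm / unnorm.sum()

target = np.array([1.0, 0.0, 0.0, 0.0])  # limit: point mass on the best-fit token
for w in (0, 2, 5, 10):
    print(w, round(tv(tilted(w), target), 6))
```

The printed TV values fall monotonically in w, mirroring the "stronger guidance = faster concentration" trade-off in the tweet above.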
Ye He @YeHeMath
Guidance sharpens support around the target label and suppresses overlap.
• 1-token: samples exactly from the tilted distribution, unlike continuous CFG
• Multi-token: differs from the tilted distribution
• We provide explicit formulas for the generated distributions
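The 1-token claim above can be checked numerically: with the common logit-space CFG rule (1 + w)·log p_cond − w·log p_uncond and Bayes' rule p(x|y) ∝ p(x)p(y|x), the guided single-token distribution coincides with the tilted distribution p(x)p(y|x)^(1+w) up to normalization. The distributions and the exponent convention below are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)
p_x   = rng.dirichlet(np.ones(5))         # unconditional token distribution
p_y_x = rng.uniform(0.05, 0.95, size=5)   # classifier likelihood p(y | x)

# Conditional distribution via Bayes: p(x | y) ∝ p(x) p(y | x).
p_cond = p_x * p_y_x
p_cond /= p_cond.sum()

w = 3.0
# CFG in logit space: (1 + w) * log p_cond - w * log p_x, renormalized.
guided = np.exp((1 + w) * np.log(p_cond) - w * np.log(p_x))
guided /= guided.sum()

# Tilted distribution: p(x) * p(y | x)^(1 + w), renormalized.
tilt = p_x * p_y_x ** (1 + w)
tilt /= tilt.sum()

print(np.allclose(guided, tilt))  # → True: the 1-token case is exactly tilted
```

The identity is a one-line algebra check: p_cond^(1+w) · p_x^(−w) ∝ (p_x p_y_x)^(1+w) · p_x^(−w) = p_x · p_y_x^(1+w). In the multi-token case the thread notes this equivalence breaks down.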
Ye He @YeHeMath
In discrete diffusion, guidance isn't just steering: it reshapes the distribution. We analyze:
• How CFG affects the sampled distributions
• How CFG changes the speed at which sampling converges