David Wessels
@Dafidofff
14.2K posts
PhD candidate w/ @erikjbekkers & @egavves interested in Geometric Deep Learning and Generative Modelling at @AmlabUva
Amsterdam · Joined November 2010
297 Following · 417 Followers
David Wessels retweeted
Erik Bekkers @erikjbekkers
Excited to share that we're looking for a new colleague at @AMLab_UvA : Assistant Professor in AI for Science 🔬🤖 AMLab is a world-class ML research group embedded in Amsterdam's thriving AI ecosystem: leading research groups, an ELLIS unit, startups, and big tech — all within reach. And Dutch academic labor conditions are genuinely among the best in Europe ❤️ Deadline: May 30 👉 werkenbij.uva.nl/en/vacancies/a…
[image]
3 replies · 34 reposts · 184 likes · 20.3K views
David Wessels retweeted
Tin Hadzi Veljkovic @HvTin
Can a lightweight Transformer compete in crystal generation without equivariance? We show that it can. Crystalite combines chemistry-aware priors with geometric attention biases for:
- SOTA CSP
- best S.U.N. among all baselines
- much faster sampling
More info below! 👇
4 replies · 24 reposts · 98 likes · 25.6K views
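The "geometric attention biases" mentioned above can be illustrated generically: add a pairwise-distance term to the attention logits so that nearby atoms interact more strongly, the same pattern used in Graphormer-style models. A minimal NumPy sketch, assuming a simple linear distance penalty; the function name and the `-scale * dist` bias are illustrative assumptions, not Crystalite's actual parameterization:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def distance_biased_attention(q, k, v, coords, scale=1.0):
    """Attention whose logits are penalized by pairwise distance.

    A generic sketch of a 'geometric attention bias'; the linear
    penalty `-scale * dist` is an assumption for illustration only.
    """
    d = q.shape[-1]
    logits = q @ k.T / np.sqrt(d)                       # content logits, shape (n, n)
    dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    return softmax(logits - scale * dist) @ v           # nearby atoms weigh more

rng = np.random.default_rng(0)
n, dim = 4, 8
q, k, v = (rng.normal(size=(n, dim)) for _ in range(3))
coords = rng.normal(size=(n, 3))                        # 3D atom positions
out = distance_biased_attention(q, k, v, coords)
print(out.shape)  # (4, 8)
```

Because the bias depends only on distances, the attention weights are unchanged under global rotations and translations of `coords`, which is what lets such models skip full equivariance machinery.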
David Wessels retweeted
Nabil Iqbal @nblqbl
new blog post! in LLMs things can get lost in long contexts. it's been shown that this may involve *positional biases* in transformers: some tokens matter more than others. i review the phenomena and discuss a (new?) way to undo the bias in a toy model. (link below)
[image]
3 replies · 6 reposts · 68 likes · 6.8K views
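One such positional bias is easy to reproduce in a few lines: under causal masking, earlier tokens are visible to more queries, so even content-free uniform attention piles mass onto them. A toy NumPy illustration of this effect; the blog's own toy model and debiasing scheme may differ:

```python
import numpy as np

# With a causal mask, token i is attended to by every query j >= i.
# Even if each query spreads its attention uniformly over its visible
# context, early tokens therefore receive more total attention mass:
# a purely positional bias, independent of content.
n = 8
received = np.zeros(n)
for query in range(n):
    weights = np.full(query + 1, 1.0 / (query + 1))  # uniform over context
    received[:query + 1] += weights

print(received)  # strictly decreasing: position 0 receives the most mass
```

Position i accumulates sum over j >= i of 1/(j+1), so the profile decays harmonically toward the end of the sequence.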
David Wessels retweeted
Olga Zaghen @olgazaghen
🔮 Working on ML on curved manifolds? Don't miss out on Jacobi Fields! 🔮 I wrote a quick, highly visual and hopefully accessible introduction to the topic: "Jacobi Fields in Machine Learning" 🤠 Check it out here: olgatticus.github.io/blog/jacobi-fi…!
[image]
12 replies · 66 reposts · 444 likes · 24.9K views
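For reference, a Jacobi field J measures how nearby geodesics spread apart: along a geodesic γ it satisfies the Jacobi equation, with D/dt the covariant derivative along γ and R the Riemann curvature tensor:

```latex
\frac{D^2 J}{dt^2} + R(J, \dot\gamma)\,\dot\gamma = 0
```

On a flat manifold R = 0 and Jacobi fields grow linearly; positive curvature makes neighboring geodesics reconverge, negative curvature makes them diverge.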
David Wessels retweeted
Alejandro García @algarciacast
New paper accepted at @GRaM_org_ !!! 🥳🤩 Here is a small infomercial about it 🤠 If you are interested, you can find extra info in the next tweet 👇
4 replies · 17 reposts · 46 likes · 4.3K views
David Wessels retweeted
Floor Eijkelboom @FEijkelboom
Discrete diffusion — but fast? ⚡️ Test-time inference — but for discrete data? 🧠 Categorical Flow Maps: continuous transport toward the simplex, turning discrete generation into a single-step problem. Built on Variational FM (CatFlow), we obtain (self-)distillation from scratch. Language, molecules, and test-time steering — one framework. Scaling to LLMs and foundation models next. Watch this space 👀
Oscar Davis @osclsd

You like discrete diffusion, but it's too slow? 🥀 You like test-time inference, but it's for continuous methods? 😩 We fixed it. Introducing Categorical Flow Maps: continuously sample discrete data in a single step 🚀💫 How? 🧵⬇️ 💪 Co-led with @FEijkelboom, @daan_roos_

2 replies · 18 reposts · 102 likes · 20.9K views
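The "single-step" idea can be sketched with an oracle velocity field: on a straight-line (flow-matching style) path from a point on the probability simplex to a one-hot target, a flow map jumps from any time t directly to t = 1 in one evaluation. This is a hedged toy with the velocity given analytically; in Categorical Flow Maps the map is learned and its parameterization may differ:

```python
import numpy as np

def one_step_to_target(x_t, v_t, t):
    """Jump from x_t at time t straight to time 1 along velocity v_t."""
    return x_t + (1.0 - t) * v_t

k = 4
x0 = np.full(k, 1.0 / k)        # uniform start on the simplex
x1 = np.eye(k)[2]               # one-hot target (category 2)
t = 0.3
x_t = (1 - t) * x0 + t * x1     # linear interpolation path
v_t = x1 - x0                   # this path's constant velocity
x_end = one_step_to_target(x_t, v_t, t)
print(np.allclose(x_end, x1))   # True: one jump lands on the target
```

Because the path is affine, every intermediate point stays on the simplex, and the single jump exactly recovers the discrete target; a trained model replaces the oracle `v_t` with a network prediction.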
David Wessels retweeted
Erik Bekkers @erikjbekkers
I usually post on technical research. But clearly, more than ever, we must consider the societal impact of our work. I thus find it important to take a clear position on the moral status of machines and its impact on us humans. New paper w/ the exceptional @AnnaCiaunica👇🧵1/8
[image]
3 replies · 13 reposts · 48 likes · 9.8K views
David Wessels retweeted
Hansen Lillemark @hansenlillemark
State-of-the-art world models still lack a unified world memory for representing and predicting dynamics out of their field of view. Why is that, and how can we fix it? Introducing Flow Equivariant World Models: models with memory capable of predicting out-of-view dynamics! 🧵⬇️
17 replies · 102 reposts · 751 likes · 89.3K views
David Wessels retweeted
GRaM Workshop at ICLR 2026 @GRaM_org_
📢 The second edition of the ✨GRaM workshop✨ is here, this time at #ICLR26. 🌟 Submit your exciting work on geometry-grounded representations. We welcome submissions in multiple tracks: 📄 proceedings, 📝 extended abstracts, and 👩‍🏫 tutorials/blogposts, as well as an exciting challenge!
[image]
1 reply · 22 reposts · 41 likes · 12K views
David Wessels retweeted
Logan Kilpatrick @OfficialLoganK
Just realized Tony Stark was a vibe coder
357 replies · 553 reposts · 8.5K likes · 354.5K views
David Wessels retweeted
UvA AMLab @AmlabUva
We’re heading to #NeurIPS2025 in San Diego with 9 accepted papers🌴 Check out the full list below and come say hi at the posters! 🧵 0 / 9
1 reply · 7 reposts · 15 likes · 1.6K views
David Wessels retweeted
Alejandro García @algarciacast
✨CAMERA READY UPDATE✨ with new cool plots in which we show how our Equivariant Neural Eikonal Solver can be used for path planning on Riemannian manifolds. Check our paper here: arxiv.org/pdf/2505.16035. And see you at NeurIPS 🥰
[image]
Alejandro García @algarciacast

🌍 From earthquake prediction to robot navigation - what connects them? Eikonal equations! We developed E-NES: a neural network that leverages geometric symmetries to solve entire families of velocity fields through group transformations. Grid-free and scalable! 🧵👇

2 replies · 24 reposts · 213 likes · 21.2K views
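For context, the eikonal equation ties the first-arrival travel time T from a source point x_s to a spatially varying speed v; its solutions are geodesic distances under the metric induced by 1/v², which is what connects earthquake travel times, robot path planning, and Riemannian geometry in one PDE:

```latex
\lVert \nabla_x T(x) \rVert = \frac{1}{v(x)}, \qquad T(x_s) = 0
```

Following the negative gradient of T from any point traces the fastest path back to the source.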
David Wessels retweeted
David W. Romero @davidwromero
Today, we are releasing Sonic-3, the fastest, most natural AI voice model out there, together with a guide to clone your own in less than 10m. Give it a try! Details on @krandiash's thread below! 👇
Karan Goel @krandiash

We've raised $100M from Kleiner Perkins, Index Ventures, Lightspeed, and NVIDIA. Today we're introducing Sonic-3 - the state-of-the-art model for realtime conversation. What makes Sonic-3 great:
- Breakthrough naturalness - laughter and full emotional range
- Lightning fast -

0 replies · 2 reposts · 24 likes · 3.7K views
David Wessels retweeted
Karan Goel @krandiash
We've raised $100M from Kleiner Perkins, Index Ventures, Lightspeed, and NVIDIA. Today we're introducing Sonic-3 - the state-of-the-art model for realtime conversation. What makes Sonic-3 great:
- Breakthrough naturalness - laughter and full emotional range
- Lightning fast -
1.4K replies · 1.2K reposts · 8.5K likes · 4.9M views
David Wessels @Dafidofff
Recent discussions have largely focused on scaling versus geometry. This is another perfect example showing that geometry can be made scalable if we as GDL people start to take scaling seriously. Let's take the best of both worlds 🦾🦾
Max Zhdanov @maxxxzdn

Clifford Algebra Neural Networks are undeservedly dismissed for being too slow, but they don't have to be! 🚀Introducing **flash-clifford**: a hardware-efficient implementation of Clifford Algebra NNs in Triton, featuring the fastest equivariant primitives that scale.

0 replies · 1 repost · 21 likes · 3K views
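For a concrete sense of the primitive being accelerated, here is the full geometric product in the plane algebra Cl(2,0), written in plain NumPy. flash-clifford's contribution is fused, hardware-efficient Triton kernels for products like this at scale; this reference version only shows the math and is not taken from that library:

```python
import numpy as np

# Multivectors in Cl(2,0) as arrays [scalar, e1, e2, e12],
# with e1^2 = e2^2 = +1 and e12 = e1 e2 (so e12^2 = -1).

def geometric_product(x, y):
    s, a1, a2, b = x
    t, c1, c2, d = y
    return np.array([
        s*t + a1*c1 + a2*c2 - b*d,    # scalar part
        s*c1 + a1*t - a2*d + b*c2,    # e1 part
        s*c2 + a2*t + a1*d - b*c1,    # e2 part
        s*d + b*t + a1*c2 - a2*c1,    # e12 (bivector) part
    ])

e1 = np.array([0.0, 1.0, 0.0, 0.0])
e2 = np.array([0.0, 0.0, 1.0, 0.0])
print(geometric_product(e1, e1))   # e1*e1 = 1   -> [1, 0, 0, 0]
print(geometric_product(e1, e2))   # e1*e2 = e12 -> [0, 0, 0, 1]
```

The anticommutativity e1·e2 = -e2·e1 is what encodes rotations; a fast kernel fuses these 16 multiply-adds (and their higher-dimensional analogues) into a single pass over memory.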