Youngsun Wi
@WiYoungsun

60 posts

PhD Student doing research on robotic manipulation 🦾🤖| Currently at @UMRobotics | Interned at @AIatMeta @NVIDIARobotics @amazondrives

Joined January 2022
192 Following · 422 Followers
Pinned Tweet
Youngsun Wi @WiYoungsun
Dexterous hands vary widely—so do tactile modalities. 🖐️🌈
Our vision on tactile human-to-robot transfer:
🔓 Not tied to specific hardware
♻️ Reuse human tactile demos across embodiments
Presenting TactAlign, a cross-sensor tactile alignment for cross-embodiment policy transfer.
Youngsun Wi retweeted
Chan Hee (Luke) Song @luke_ch_song
🧠 Excited to share our vision for embodied systems: generalist models orchestrating specialist perception tools for spatial reasoning. 🛠️ Even robots themselves are treated as tools for manipulation. 🚀 Great work led by @ChenSiyich and the team at @nvidia behind RoboSpatial!
Siyi Chen @ChenSiyich

🎉 Accepted to #CVPR2026
🔎 VLMs fall short on complex spatial reasoning. They struggle with:
• Precise geometric perception
• Multi-step reasoning grounded in 3D
• Adapting perception dynamically to task and context
🚀 We propose a solution: visual tool-augmented spatial reasoning — bridging perception and multi-step reasoning through diverse, error-aware, adaptive vision tool use.
And we go one step further: 🤖 enabling robot control by treating robots themselves as tools.
Our framework is powered by:
⚡ Double Interactive RL (DIRL), a new training framework combining demonstrations + real exploration
🛠 Real interaction with specialized computer vision models during RL
🤖 Toolshed, a scalable, asynchronous system for multimodal execution of vision tools and robots-as-tools
🔗 Project: spacetools.github.io
Code: github.com/spacetools/Spa…
Toolshed is released with frontier-model demos. Full training & evaluation release coming soon.
Done during my internship @NVIDIA — big thanks to the amazing collaborators! 🙌
#CVPR2026 #EmbodiedAI #ComputerVision #Robotics #ReinforcementLearning #MultimodalAI
Youngsun Wi retweeted
Carolina Higuera @carohiguerarias
Most world models for robot manipulation learn physics from pixels. But pixels don’t see it all. Can we ground these models in the "feeling" of contact to disambiguate visually identical states? Visuo-Tactile World Models (VT-WM): robot imagination in a shared space👇
Youngsun Wi retweeted
Chan Hee (Luke) Song @luke_ch_song
🚀 Freshly accepted to CVPR 2026
What if we could train computer-using agents just by watching YouTube?
We present Watch & Learn (W&L) -- an inverse-dynamics framework that turns internet videos of humans using computers into learnable UI trajectories at scale. Thread 👇
Youngsun Wi @WiYoungsun
@OpenGraph_Labs Thank you! 🙏 We also believe that gloves with tactile sensing are an excellent medium for collecting dexterous human demonstrations — a powerful source for robot dexterity :)
Youngsun Wi @WiYoungsun
@mangahomanga Thanks so much @mangahomanga 🙏! Bringing tactile into human data feels like a key step for dexterous H2R! I really appreciate how consistently you’ve been pushing H2R forward in vision spaces. I always learn a lot. Excited to see what you do next and how the space evolves 🪄✨
Homanga Bharadhwaj @mangahomanga
@WiYoungsun Congrats Youngsun! Great to see this out. I'm a big fan of trying to extract and predict tactile sensing information from human data, for manipulation.
Youngsun Wi @WiYoungsun
Also, huge thanks to everyone who has provided valuable input and support throughout: @luke_ch_song, Fan Yang, @MarkVanderMerwe, James Lorenz, @TingfanW1208, Francois Hogan, Taosha Fan, Sayantan Kundu, Mike Lambeta, Wonik Robotics.
Youngsun Wi @WiYoungsun
We’ve open-sourced OSMO, a tactile glove with full-palm coverage that measures both normal and shear forces, in a form factor that lets data collection inherit human dexterity! Details: 👇
Haozhi Qi @HaozhiQ

Human videos are great for robot learning, but they critically lack the sense of touch needed for complex manipulation tasks! Introducing OSMO, an open-source tactile glove that captures rich contact signals during human demonstrations for direct, successful transfer to robots.

Adithya Murali @Adithya_Murali_
Happy to share that I’ve been selected for MIT Technology Review Innovators Under 35 Asia Pacific 2025! @techreview 🙏 Grateful to my mentors and research collaborators at @NVIDIARobotics, CMU and beyond. This recognizes some of our work on scaling robot learning with procedural simulation and low-cost robots back in the day at @CMU_Robotics. Congratulations to all the other awardees recognized this year. 🔗 tr35.mittrasia.com/awards