Jayjun Lee

133 posts

@jayjunleee

Robotics Research Intern @NvidiaAI; Robot Learning PhD student @UMRobotics; Prev @imperialcollege

Joined April 2023
330 Following · 118 Followers
Pinned Tweet
Jayjun Lee @jayjunleee ·
Has visual fidelity outpaced dynamic fidelity in tactile simulation for sim-to-real transfer of policies? Introducing HydroShear 🏄‍♂️ : a hydroelastic tactile shear simulation for training zero-shot sim-to-real tactile policies in contact-rich tasks where fingertip force and shear matter most! Webpage: hydroshear.github.io Robot videos in 1x 🧵👇
Jayjun Lee @jayjunleee ·
Excited to share our new robot learning sim benchmark RoboMME with a wide range of non-Markovian tasks specifically designed to evaluate different forms of memory for manipulation policies! robomme.github.io
Yinpei Dai @YinpeiD

Robot memory methods are growing fast, but systematic evaluation is largely lacking. 📉 Introducing RoboMME: a new benchmark for memory-augmented robotic manipulation! 🤖🧠 Featuring 16 tasks across temporal, spatial, object, and procedural memory 🔗 robomme.github.io

Jayjun Lee retweeted
Mustafa Mukadam @mukadammh ·
High-fidelity tactile simulation with shear(!!) to train sim-to-real RL policies. Years ago I was less bullish on tactile simulation, so we doubled down on SSL with just real data to build the Sparsh models. Now we can revisit co-training on sim + real data.
Jayjun Lee @jayjunleee

Stephen James @stepjamUK ·
As a newly appointed Assistant Professor at @imperialcollege, I'm thrilled to announce the Safe Whole-body Intelligent Robotics Lab (SWIRL) at Imperial College London. SWIRL (swirl.uk) is a new research lab focused on the intersection of safety and intelligence in next-generation robotics. We're hiring exceptional PhD students who are passionate about pushing the boundaries of robot learning.

What makes SWIRL unique? We operate at the exciting convergence of:
• Online & offline reinforcement learning
• Imitation learning & human demonstrations
• Sample-efficient learning methods
• Whole-body and soft robotics systems

We're looking for prospective PhD students interested in:
• Developing safe exploration algorithms for robotic systems
• Creating sample-efficient learning methods that minimize real-world trials
• Building foundation models for robotics with safety guarantees
• Advancing soft robotics and compliant human-robot interaction
• Bridging theory and practice in embodied AI

Why now? As robots become more capable and work closer with humans, we need systems that are both intelligent enough to handle complex tasks AND safe enough for real-world deployment. Traditional approaches treat safety and intelligence as competing priorities; we believe they're synergistic.

If you're a motivated researcher who wants to develop the theoretical foundations and practical algorithms for tomorrow's safe, intelligent robots, I'd love to hear from you. Want to join? Apply via swirl.uk
Jayjun Lee retweeted
Martin Ziqiao Ma @ziqiao_ma ·
Thanks @_akhaliq for sharing our work! The core takeaway from AimBot is straightforward: explicit spatial cues (such as shooting lines and scope reticles) are strong 2.5D augmentations that enhance the spatial grounding of any VLA model. These cues are interpretable and also strengthen proprioceptive state encoding. To appear in #CoRL2025 @corl_conf. Find out more here -> aimbot-reticle.github.io
AK @_akhaliq

AimBot: A Simple Auxiliary Visual Cue to Enhance Spatial Awareness of Visuomotor Policies

Jayjun Lee @jayjunleee ·
To wrap up, I think simulation and sim2real methods are great for acquiring multi-sensory robot data to gain understanding about the physical world and for learning dexterous manipulation skills! Thanks to my awesome advisor @NimaFazeli7! @UMRobotics
Jayjun Lee @jayjunleee ·
We then zero-shot transfer to the real world and demonstrate its performance on both in-hand object pose and extrinsic contact patch prediction from partial point clouds and tactile shear feedback.
Jayjun Lee @jayjunleee ·
How can robots feel and localize extrinsic contacts through both vision👀and touch🖐️? #RSS2025 @RoboticsSciSys Introducing ViTaSCOPE: a sim-to-real visuo-tactile neural implicit representation for in-hand object pose and extrinsic contact estimation! jayjunlee.github.io/vitascope/