Venkatesh

56 posts

@venkyp2000

Founding RE | prev robot therapist @CILVRatNYU @nyuniversity ; alum @IITIOfficial

New York City · Joined March 2017
133 Following · 281 Followers
Pinned Tweet
Venkatesh @venkyp2000 ·
Making touch sensors has never been easier! Excited to present eFlesh, a 3D printable tactile sensor that aims to democratize robotic touch. All you need to make your own eFlesh is a 3D printer, some magnets and a magnetometer. See thread 👇 and visit e-flesh.com
8 replies · 88 reposts · 555 likes · 69.1K views
Venkatesh retweeted
Irmak Guzey @irmakkguzey ·
Learning from human data requires human-like hardware. Humans use their wrists constantly, but table-top manipulators lack this flexibility. We build upon RUKA and introduce RUKA-v2: a tendon-driven hand with a 2-DOF wrist and finger abduction/adduction 👋✌️
7 replies · 28 reposts · 113 likes · 8.1K views
Venkatesh retweeted
Siddhant Haldar @haldar_siddhant ·
Robot foundation models are limited by costly real data, while simulation data is plentiful but visually mismatched to reality. We present Point Bridge, a method that enables zero-shot sim-to-real transfer for robot learning with minimal visual alignment. pointbridge3d.github.io
4 replies · 41 reposts · 221 likes · 19.2K views
Venkatesh retweeted
Lerrel Pinto @LerrelPinto ·
Introducing YOR. Balancing budget and functionality for a capable mobile robot is always a challenge. To give researchers and hobbyists more options, we built our own open-source one for ~$10k.
1 reply · 9 reposts · 55 likes · 3.2K views
Venkatesh retweeted
Jeff Cui @jeffacce ·
We don't need the name of an object to pick it up; we simply need to know where it is and what it looks like. Introducing Contact-Anchored Policies (CAPs): instead of language, we explicitly condition on contacts. Our policy learns object pickup with only 16 hours of data! 🧵
5 replies · 28 reposts · 111 likes · 12.7K views
Venkatesh retweeted
Irmak Guzey @irmakkguzey ·
Dexterous manipulation by directly observing humans - a dream in AI for decades - is hard due to visual and embodiment gaps. With simple yet powerful hardware - Aria 2 glasses 👓 - and our new work AINA 🪞, we are now one significant step closer to achieving this dream.
7 replies · 33 reposts · 145 likes · 34K views
Venkatesh retweeted
Raunaq Bhirangi @Raunaqmb ·
When @anyazorin and @irmakkguzey open-sourced the RUKA Hand (a low-cost robotic hand) earlier this year, people kept asking us how to get one. Open hardware isn’t as easy to share as code. So we’re releasing an off-the-shelf RUKA, in collaboration with @WowRobo and @zhazhali01.
14 replies · 42 reposts · 252 likes · 48K views
Venkatesh retweeted
Zifan Zhao @Zifan_Zhao_2718 ·
🚀 With minimal data and a straightforward training setup, our VisuoTactile Local Policy (ViTaL) fuses egocentric vision + tactile feedback to achieve millimeter-level precision & zero-shot generalization! 🤖✨ Details ▶️ vitalprecise.github.io
1 reply · 9 reposts · 35 likes · 4K views
Venkatesh retweeted
Siddhant Haldar @haldar_siddhant ·
Current robot policies often face a tradeoff: they're either precise (but brittle) or generalizable (but imprecise). We present ViTaL, a framework that lets robots generalize precise, contact-rich manipulation skills across unseen environments with millimeter-level precision. 🧵
9 replies · 79 reposts · 592 likes · 145.8K views
Venkatesh retweeted
Raunaq Bhirangi @Raunaqmb ·
Generalization needs data. But data collection is hard for precise tasks like plugging USBs, swiping cards, inserting plugs, and keying locks. Introducing robust, precise VisuoTactile Local (ViTaL) policies: >90% success rates from just 30 demos and 45 min of real-world RL. 🧶⬇️
5 replies · 27 reposts · 227 likes · 31.8K views
Venkatesh retweeted
Minyoung Hwang @robominyoung ·
Interested in how generative AI can be used for human-robot interaction? We’re organizing the 2nd Workshop on Generative AI for Human-Robot Interaction (GenAI-HRI) at #RSS2025 in LA, bringing together the world's leading experts in the field. The workshop is happening on Wed, June 25th @ USC 🌴
📍 Location: RTH 109
Let’s connect if you’re into 🤖 Robotics + 🧠 Generative AI for real-world HRI! Don’t miss our stacked lineup of speakers 👇
🔗 sites.google.com/view/gai-hri/h…
@DorsaSadigh @ybisk @V_Vanhoucke @TapoBhat @RoozbehMottaghi @brenna_argall @robo_kween
[image]
2 replies · 5 reposts · 20 likes · 4.8K views
Venkatesh retweeted
Lerrel Pinto @LerrelPinto ·
We have developed a new tactile sensor, called e-Flesh, with a simple working principle: measure deformations in 3D printable microstructures. Now all you need to make tactile sensors is a 3D printer, magnets, and magnetometers! 🧵
125 replies · 724 reposts · 6.9K likes · 597.5K views
Venkatesh retweeted
Raunaq Bhirangi @Raunaqmb ·
Tactile sensing is gaining traction, but slowly. Why? Because integration remains difficult. But what if adding touch sensors to your robot was as easy as hitting “print”? Introducing eFlesh: a 3D-printable, customizable tactile sensor. Shape it. Size it. Print it. 🧶👇
19 replies · 103 reposts · 822 likes · 85.6K views
Venkatesh @venkyp2000 ·
We train some precise robot policies for contact-rich tasks like erasing and insertion, using Visuo-Skin, achieving over 90% success. We also evaluate a learned slip detection model on unseen objects, and the model achieves a 95% success rate.
1 reply · 1 repost · 7 likes · 951 views