Haritheja
@HarithejaE
CS PhD student @Berkeley_AI
45 posts · Joined May 2023 · 108 Following · 121 Followers

Pinned Tweet
Haritheja @HarithejaE ·
Introducing Robot Utility Models! Deploy policies for a variety of simple manipulation tasks in your environment zero-shot, without additional data or further training. The key? Collecting diverse and *high quality* data. Read more in our paper and try them out for yourself! This was work co-led w/ @notmahi, with help from many amazing collaborators at @nyuniversity and @hellorobotinc.
Lerrel Pinto @LerrelPinto

Super excited for the release of Robot Utility Models (RUMs)! RUMs is a simple method to build zero-shot robot policies that can solve useful tasks in completely new homes without any additional training, often at a 90%+ success rate. 🧵👇

3 replies · 4 reposts · 25 likes · 3.2K views
Haritheja reposted
Will Liang @ ICLR @willjhliang ·
Introducing Tether 🪢, a fun little idea to scale data by having our robot “play” in the real world for over 24 hours, throughout the day and overnight—improving policies from zero to mastery with minimal supervision! But play is messy, with out-of-distribution scenarios that are hard to anticipate. To perform autonomous functional play in the real world, from just a handful of demos, we propose a highly robust few-shot imitation method that warps demo trajectories using visual correspondences. Then, continuously running it within a multi-task VLM-guided cycle, we generate a data stream that produces 1000+ expert-level demos. This generated data is finally funneled downstream to train imitation learning policies, which improve from zero to near-perfect success rates. We’ll be presenting Tether at #ICLR2026 in just a few weeks! But before that, deep dive with me… 🧵
7 replies · 44 reposts · 272 likes · 44.2K views
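The Tether loop described in the tweet above (warp a handful of seed demos to the current scene via visual correspondences, run them in a VLM-guided cycle, and keep successful rollouts as new training data) can be caricatured as a toy simulation. Every function and number here is a hypothetical stand-in, not the paper's actual API; real correspondences, task selection, and execution are replaced by random draws.

```python
import random

def warp_demo(demo, offset):
    # Hypothetical stand-in: shift each demo waypoint by the
    # visual-correspondence offset for the current scene.
    return [(x + offset[0], y + offset[1]) for x, y in demo]

def collect_play_data(demos, hours=24, trials_per_hour=50, success_rate=0.9):
    """Toy model of the autonomous play cycle: repeatedly warp a seed
    demo to the scene and keep only the successful rollouts."""
    rng = random.Random(0)
    dataset = []
    for _ in range(hours * trials_per_hour):
        demo = rng.choice(demos)                           # a VLM would pick the task here
        offset = (rng.uniform(-1, 1), rng.uniform(-1, 1))  # stand-in for correspondences
        rollout = warp_demo(demo, offset)
        if rng.random() < success_rate:                    # stand-in for real execution
            dataset.append(rollout)
    return dataset

data = collect_play_data([[(0.0, 0.0), (1.0, 1.0)]])
print(len(data))  # on the order of 1000+ kept rollouts, matching the tweet's scale
```

The point of the sketch is only the data flow: a few demos in, a day of unattended trials, and a thousand-plus filtered rollouts out for downstream imitation learning.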
Haritheja reposted
Jeff Cui @jeffacce ·
Fully open-source, customizable hardware is the way for robotics research. Introducing Your Own Robot (YOR), a mobile bimanual robot platform for ~$10k.
2 replies · 4 reposts · 32 likes · 1.5K views
Haritheja reposted
Mahi Shafiullah 🏠🤖 @notmahi ·
Why buy a robot when you can build your own? Meet YOR, our new open-source bimanual mobile manipulator robot – built for researchers and hackers alike for only ~$10k. 🧵👇
7 replies · 22 reposts · 171 likes · 37.3K views
Haritheja reposted
Jeff Cui @jeffacce ·
We don't need the name of an object to pick it up; we simply need to know where it is and what it looks like. Introducing Contact-Anchored Policies (CAPs): instead of language, we explicitly condition on contacts. Our policy learns object pickup with only 16 hours of data! 🧵
5 replies · 28 reposts · 111 likes · 12.7K views
Haritheja reposted
Mahi Shafiullah 🏠🤖 @notmahi ·
Best ideas are often the simplest in hindsight. Meet Contact-Anchored Policies (CAP)🧢: by conditioning policies on physical contact (vs language) we achieve env & embodiment generalization with super low resources. This policy ⬇️ learned to pick from scratch w/ 16 hrs of data 🧵
7 replies · 31 reposts · 174 likes · 16.2K views
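The core idea in the two CAP tweets above — condition the policy on an explicit contact point instead of a language instruction — can be sketched minimally. The encoder, weights, and dimensions below are all made up for illustration; only the conditioning scheme (contact coordinates concatenated where a task token would go) reflects the tweets' description.

```python
import numpy as np

rng = np.random.default_rng(0)

def contact_conditioned_policy(image_feat, contact_xy, W):
    # Instead of a language embedding, the policy input concatenates the
    # image features with an explicit contact point: "where", not "what".
    obs = np.concatenate([image_feat, contact_xy])
    return np.tanh(W @ obs)  # toy linear policy head -> bounded action

image_feat = rng.standard_normal(16)   # stand-in for a visual encoder output
contact_xy = np.array([0.3, -0.1])     # the contact anchor; no object name needed
W = rng.standard_normal((7, 18))       # e.g. a 7-DoF action head (hypothetical)
action = contact_conditioned_policy(image_feat, contact_xy, W)
print(action.shape)  # (7,)
```

Because the conditioning signal is geometric rather than linguistic, the same policy can in principle be pointed at any object in view, which is the generalization claim both tweets make.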
Haritheja reposted
Haozhi Qi @HaozhiQ ·
I will join UChicago CS @UChicagoCS as an Assistant Professor in late 2026, and I’m recruiting PhD students in this cycle (2025 - 2026). My research focuses on AI & Robotics - including dexterous manipulation, humanoids, tactile sensing, learning from human videos, robot systems, and anything needed to make robots truly work and improve everyday life. I also place strong emphasis on open-source. Check my homepage to learn more: haozhi.io. Please reach out if you are interested! The deadline is Dec 11th. Link: tinyurl.com/uchiapp.
[image]
26 replies · 101 reposts · 646 likes · 104.6K views
Haritheja reposted
Alper Canberk @alpercanbe ·
When training ACT-1, we treated data from diverse, long-horizon tasks in the wild as a first-class citizen. This makes generalization the default, not an exception. The capability envelope expands. More to come.
43 replies · 68 reposts · 1K likes · 390.1K views
Haritheja reposted
Ruoshi Liu @ruoshi_liu ·
Everyone says they want general-purpose robots. We actually mean it — and we’ll make it weird, creative, and fun along the way 😎 Recruiting PhD students to work on Computer Vision and Robotics @umdcs for Fall 2026 in the beautiful city of Washington DC!
[3 images]
31 replies · 76 reposts · 495 likes · 114.5K views
Haritheja reposted
Harsh Gupta @hgupt3 ·
✈️🤖 What if an embodiment-agnostic visuomotor policy could adapt to diverse robot embodiments at inference with no fine-tuning? Introducing UMI-on-Air, a framework that brings embodiment-aware guidance to diffusion policies for precise, contact-rich aerial manipulation.
9 replies · 33 reposts · 215 likes · 60.2K views
Haritheja reposted
Ritvik Singh @ritvik_singh9 ·
Our latest work performs sim2real dexterous grasping using end-to-end depth RL.
14 replies · 49 reposts · 395 likes · 49.5K views
Haritheja reposted
Hello Robot @hellorobotinc ·
We had an incredible time showcasing everything Stretch 3 is capable of at the AI for Good Summit! It was a pleasure to be joined by the talented team from the NYU GRAIL lab, who demonstrated their cutting-edge work on Robot Utility Models. Learn more at: robotutilitymodels.com
0 replies · 6 reposts · 25 likes · 7K views
Haritheja reposted
Hello Robot @hellorobotinc ·
The Hello Robot team is ready to go at the AI for Good Global Summit! Excited to connect with innovators from around the world and share how Hello Robot is building useful, inclusive robots that make a real difference. Can’t wait to get underway! #AIforGood #HelloRobot @AIforGood
[image]
5 replies · 2 reposts · 18 likes · 1.4K views
Haritheja reposted
Jiafei Duan @DJiafei ·
Robot collecting robot data. Having fun with this awesome data collection tool from @notmahi! *Auto reset at the end.
0 replies · 6 reposts · 63 likes · 4.1K views
Haritheja reposted
Raunaq Bhirangi @Raunaqmb ·
Generalization needs data. But data collection is hard for precise tasks like plugging USBs, swiping cards, inserting plugs, and keying locks. Introducing robust, precise VisuoTactile Local (ViTaL) policies: >90% success rates from just 30 demos and 45 min of real-world RL.🧶⬇️
5 replies · 27 reposts · 227 likes · 31.8K views
Haritheja reposted
Haoyu Xiong @Haoyu_Xiong_ ·
Your bimanual manipulators might need a Robot Neck 🤖🦒 Introducing Vision in Action: Learning Active Perception from Human Demonstrations ViA learns task-specific, active perceptual strategies—such as searching, tracking, and focusing—directly from human demos, enabling robust visuomotor policies under visual occlusions. 🧵👇
18 replies · 93 reposts · 432 likes · 120K views
Haritheja reposted
Venkatesh @venkyp2000 ·
Making touch sensors has never been easier! Excited to present eFlesh, a 3D printable tactile sensor that aims to democratize robotic touch. All you need to make your own eFlesh is a 3D printer, some magnets, and a magnetometer. See thread 👇 and visit e-flesh.com
8 replies · 88 reposts · 555 likes · 69.1K views
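The eFlesh tweet above names the full sensing stack: a magnet embedded in a printed structure moves under contact, and a magnetometer reads the changed field. A minimal sketch of that principle, with a made-up linear calibration (the `gain_N_per_uT` constant and the field readings are illustrative values, not from the project):

```python
import numpy as np

def field_to_force(reading_uT, baseline_uT, gain_N_per_uT=0.05):
    # Deviation of the 3-axis magnetometer reading from its no-contact
    # baseline; a simple calibration maps its magnitude to a force estimate.
    deviation = np.asarray(reading_uT) - np.asarray(baseline_uT)
    return gain_N_per_uT * np.linalg.norm(deviation)

baseline = [20.0, -5.0, 43.0]   # field with no contact (made-up microtesla values)
pressed  = [20.0, -5.0, 51.0]   # magnet displaced toward the sensor under a press
print(round(field_to_force(pressed, baseline), 2))  # 0.4
```

A real sensor would need per-unit calibration (the field-to-force map is nonlinear and depends on the printed geometry), but the sketch shows why the bill of materials can be just a printer, magnets, and a magnetometer.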
Haritheja reposted
Lerrel Pinto @LerrelPinto ·
Teaching robots to learn only from RGB human videos is hard! In Feel The Force (FTF), we teach robots to mimic the tactile feedback humans experience when handling objects. This allows for delicate, touch-sensitive tasks—like picking up a raw egg without breaking it. 🧵👇
18 replies · 84 reposts · 534 likes · 70.1K views
Haritheja reposted
Lerrel Pinto @LerrelPinto ·
Imagine robots learning new skills—without any robot data. Today, we're excited to release EgoZero: our first steps in training robot policies that operate in unseen environments, solely from data collected through humans wearing Aria smart glasses. 🧵👇
6 replies · 55 reposts · 314 likes · 42.4K views