Learning and Intelligent Systems (LIS) @ MIT

133 posts


@MIT_LISLab

PIs: Leslie Pack Kaelbling, Tomás Lozano-Pérez AI/ML/Robotics Research @MIT_CSAIL Website: https://t.co/TgsPKmvPCb

Joined November 2020
260 Following · 1.8K Followers
Learning and Intelligent Systems (LIS) @ MIT retweeted
Lucy Cai @LucyCai9
Imagine you told a robot to "find your car keys" in your apartment and it looked around, opened a drawer, and retrieved them for you. As a step towards that, I adapted TiPToP to run on the RBY1 humanoid in our lab! Here's an example instruction it follows: "Put the green block on the blue plate and the yellow block on the magazine." TiPToP helps plan over the right arm + single torso joint, but it's easy to unlock more joints -- even the base wheels -- for more expressive, real-world tasks. Humans find objects without thinking twice. One day, robots will too! 🤖
Nishanth Kumar @nishanthkumar23

State-of-the-art robot policies often need hundreds of hours of data. What if we needed none? Introducing TiPToP: a manipulation system that zero-shots open-world tasks from pixels and language using vision foundation models and GPU-parallelized Task and Motion Planning (TAMP).

Learning and Intelligent Systems (LIS) @ MIT retweeted
Robots Digest 🤖 @robotsdigest
Everyone is scaling VLAs with more robot data. TiPToP shows another path. No robot training, no policy learning. Just RGB + language → 3D scene → GPU TAMP planner → trajectory. Foundation models + planning alone can run real manipulation tasks.
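The pipeline in this tweet (RGB + language → 3D scene → planner → trajectory) can be sketched as a toy control flow. Everything below is an illustrative stand-in, not the actual TiPToP code: the function names, the scene representation, and the waypoint planner are all hypothetical.

```python
# Toy sketch of a TiPToP-style pipeline: pixels + language in, trajectory out.
# All names and data structures here are illustrative stand-ins.

def perceive_scene(rgb_image):
    """Stand-in for vision foundation models: pixels -> object 3D positions."""
    # A real system would run open-vocabulary detection + depth estimation.
    return {"green block": (0.4, 0.1, 0.02), "blue plate": (0.6, -0.2, 0.0)}

def ground_goal(instruction, scene):
    """Stand-in for language grounding: find the objects the command mentions."""
    mentioned = [name for name in scene if name in instruction]
    return ("on", mentioned[0], mentioned[1])  # e.g. block goes on plate

def plan_trajectory(scene, goal):
    """Stand-in for the TAMP solver: waypoints for a simple pick-and-place."""
    _, obj, target = goal
    pick, place = scene[obj], scene[target]
    above = lambda p: (p[0], p[1], p[2] + 0.1)  # hover 10 cm above
    return [above(pick), pick, above(pick), above(place), place]

scene = perceive_scene(None)
goal = ground_goal("Put the green block on the blue plate", scene)
trajectory = plan_trajectory(scene, goal)
print(goal)  # ('on', 'green block', 'blue plate')
```

The point of the sketch is the claim in the tweet: no learned policy appears anywhere; perception and planning alone produce the executed motion.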
Learning and Intelligent Systems (LIS) @ MIT retweeted
Anirudha Majumdar @Majumdar_Ani
It was a pleasure to be back at @MIT to present at the #Robotics Seminar! Great to see all the exciting work happening there. Thanks so much @GioeleZardini for hosting me!
Learning and Intelligent Systems (LIS) @ MIT retweeted
Ken Goldberg @Ken_Goldberg
Data Flywheel -> Data Avalanche: Thx to Leslie Kaelbling and Pulkit Agrawal (@Pulkitology) for suggesting "avalanche" as a better metaphor than "flywheel" for combining model-free + model-based methods to bootstrap a specific robot task and amplify on-policy data collection.
Learning and Intelligent Systems (LIS) @ MIT
Check out new work from @nishanthkumar23 in the group on learning and planning with symbolic world models!
RoboPapers @RoboPapers

Reasoning over long horizons would let robots generalize zero-shot to unseen environments and settings. One mechanism for this kind of reasoning is a world model, but traditional video world models still struggle with long horizons and are very data-intensive to train. What if, instead of predicting future images, we predicted just the symbolic information necessary for reasoning? @nishanthkumar23 tells us about Pixels to Predicates, a symbol-grounding method that lets a VLM plan sequences of robot skills to achieve unseen goals in previously unseen settings. To find out more, watch episode #44 of RoboPapers with @micoolcho and @chris_j_paxton now!
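The "predict symbols, not pixels" idea can be made concrete with a tiny symbolic planner: once a VLM has grounded the scene into predicates, sequencing skills is ordinary search over symbolic states. A minimal sketch, with made-up predicates and skills (not the paper's actual predicate set or planner):

```python
from collections import deque

# Toy planning over symbolic predicates instead of images. Each skill maps
# name -> (preconditions, add effects, delete effects), all sets of
# ground predicates. The predicates and skills here are invented for the sketch.
SKILLS = {
    "pick(block)": ({"on_table(block)", "hand_empty()"},
                    {"holding(block)"},
                    {"on_table(block)", "hand_empty()"}),
    "place(block, plate)": ({"holding(block)"},
                            {"on(block, plate)", "hand_empty()"},
                            {"holding(block)"}),
}

def plan(state, goal):
    """Breadth-first search over symbolic states; returns a skill sequence."""
    state = frozenset(state)
    queue, seen = deque([(state, [])]), {state}
    while queue:
        current, steps = queue.popleft()
        if goal <= current:          # every goal predicate holds
            return steps
        for name, (pre, add, delete) in SKILLS.items():
            if pre <= current:       # skill is applicable
                nxt = frozenset((current - delete) | add)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + [name]))
    return None  # goal unreachable from this state

init = {"on_table(block)", "hand_empty()"}
goal = {"on(block, plate)"}
print(plan(init, goal))  # ['pick(block)', 'place(block, plate)']
```

A symbolic state like this is orders of magnitude smaller than a predicted video frame, which is why planning over predicates scales to longer horizons.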

Learning and Intelligent Systems (LIS) @ MIT
We're excited for #RLC2025! If you're at the conference, be sure to catch our PI Leslie Kaelbling's keynote on "RL: Rational Learning" from 9-10 in CCIS 1-430. Leslie will talk about some new perspectives + exciting new results from the group: you won't want to miss it! 🤖
Learning and Intelligent Systems (LIS) @ MIT retweeted
Zhutian (Skye) Yang @ZhutianYang_
#ICRA2025 🤖 I spent 3 years of my PhD building efficient long-horizon manipulation planning algorithms. VLMs ultimately provide the essential common-sense and horizon-reduction benefits.
❗ VLMs can generate plausible robot task plans, but the actions may not be feasible for robots due to reachability and obstacles.
🤝 So we use a TAMP planner that takes in the next VLM-generated subgoal and plans the actions and trajectories.
📈 VLM-TAMP successfully solves cooking problems that require 30-50 actions and involve 20+ objects.
Excited to present it on Thursday! zt-yang.github.io/vlm-tamp-robot/
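The division of labor described here (VLM proposes the next subgoal, TAMP checks feasibility and produces motions) can be sketched as a short control loop. Both components below are stubs invented for illustration; the real system replans rather than skipping infeasible subgoals:

```python
# Toy control loop in the spirit of VLM-TAMP: a high-level proposer suggests
# subgoals in commonsense order, and a motion-aware refiner accepts or
# rejects each one. Both functions are illustrative stubs.

def propose_subgoals(task):
    """Stand-in for the VLM: a commonsense ordering of subgoals."""
    return {"make soup": ["open fridge", "grab pot", "fill pot", "boil pot"]}[task]

def tamp_refine(subgoal, world):
    """Stand-in for the TAMP planner: feasibility check + concrete actions."""
    if subgoal in world["blocked"]:
        return None  # e.g. target unreachable or obstructed
    return [f"move_to({subgoal})", f"execute({subgoal})"]

def run(task, world):
    executed = []
    for subgoal in propose_subgoals(task):
        actions = tamp_refine(subgoal, world)
        if actions is None:
            # The real method would ask for a new subgoal here;
            # we skip to keep the sketch short.
            continue
        executed.extend(actions)
    return executed

world = {"blocked": {"open fridge"}}
print(run("make soup", world))  # the blocked subgoal produces no actions
```

The key point is that neither component alone suffices: the proposer has no geometry, and the refiner has no common sense about task ordering.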
Learning and Intelligent Systems (LIS) @ MIT retweeted
Nishanth Kumar @nishanthkumar23
Curious to hear about creating generalist robots from leaders in the field? Don’t miss our panel “Representations for Generalist Robots” (4-5pm) @corl_conf LEAP workshop! Feat. @chelseabfinn @animesh_garg Vincent Vanhoucke @Marc__Toussaint @sidsrivast and Leslie Kaelbling!
Learning and Intelligent Systems (LIS) @ MIT retweeted
Conference on Robot Learning
Incredible insights from Prof. Tomás Lozano-Pérez of MIT during his keynote on the evolution of robotics! He took us on a journey through the shifting landscape of robotics over the years, discussing the "inverted pendulum" theory of robotics. #CoRL2024 #RobotLearning #AI #Robotics
Learning and Intelligent Systems (LIS) @ MIT retweeted
Yichao Liang @yichao_liang
How can we get VLMs to help robots solve complex long-horizon tasks? Introducing VisualPredicator: an agent that leverages VLMs to learn predicates and operators for classical planners. Our system can stack blocks, balance weights on a balance beam, and even pour coffee🦾! [1/9]