

Tom Silver
@tomssilver
Assistant Professor @Princeton. Developing robots that plan and learn to help people.


New preprint on learning abstract world models for robotics planning. Paper + code below. 🤖🌐 Must an agent plan by simulating pixels frame by frame, or can it think in abstractions? Consider planning an international flight: we can reason about buying tickets, changing airplanes, and crossing borders without committing to the color of the airplane or the milliseconds before takeoff. Absent abstraction, planning over long time horizons would be intractable, because every minute detail of the world would need to be simulated. [1/7]
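The flight example above can be made concrete with a toy sketch (not from the paper; the city graph and names are invented for illustration). An abstract state here is just a city, and an action is a flight leg, so a plain breadth-first search finds a long-horizon plan without ever simulating low-level details like seat colors or millisecond timing:

```python
from collections import deque

# Toy abstract transition model: which cities are reachable by one flight.
# Nothing below the city level ever enters the state.
FLIGHTS = {
    "NYC": ["LON", "PAR"],
    "LON": ["TOK"],
    "PAR": ["TOK", "NYC"],
    "TOK": [],
}

def plan(start, goal):
    """Breadth-first search over abstract states; returns a list of cities,
    or None if the goal is unreachable."""
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for nxt in FLIGHTS.get(path[-1], []):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None

print(plan("NYC", "TOK"))  # ['NYC', 'LON', 'TOK']
```

The search space is a handful of cities instead of every pixel-level world state, which is the tractability argument in the post.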



State-of-the-art robot policies often need hundreds of hours of data. What if we needed none? Introducing TiPToP: a manipulation system that solves open-world tasks zero-shot from pixels and language, using vision foundation models and GPU-parallelized Task and Motion Planning (TAMP).
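A minimal sketch of the parallel-sampling idea behind GPU-accelerated TAMP (this is not the TiPToP code; the feasibility rule and all numbers are invented, and NumPy stands in for the GPU batch). TAMP interleaves discrete task planning with continuous motion sampling, and the speedup comes from scoring many candidate motion parameters in one vectorized call:

```python
import numpy as np

rng = np.random.default_rng(0)

def feasible_batch(grasp_angles, obstacle_angle, clearance=0.3):
    """Vectorized feasibility check over a whole batch of candidates:
    a grasp fails if it comes within `clearance` radians of the
    (hypothetical) obstacle direction."""
    return np.abs(grasp_angles - obstacle_angle) > clearance

# Sample 10,000 candidate grasp angles and check them all in one batch,
# instead of testing candidates one at a time in a Python loop.
candidates = rng.uniform(-np.pi, np.pi, size=10_000)
mask = feasible_batch(candidates, obstacle_angle=0.0)
best = candidates[mask][0]  # first feasible candidate
print(mask.mean())  # fraction of feasible grasps
```

On a GPU the same pattern applies with the batch living on the device, so the motion-level "inner loop" of TAMP becomes one tensor operation per candidate set.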






Our new paper develops robots that don’t just complete tasks: they anticipate how their actions impact what comes next. Example: when putting objects away, organizing them neatly isn’t just aesthetic—it makes future retrieval faster and easier. 📝🧵👇
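The tidying example can be phrased as a one-step lookahead over future costs. A toy sketch (not the paper's method; all costs are invented for illustration): a myopic agent minimizes only the immediate placement effort, while a farsighted agent also charges itself for the retrievals it expects later.

```python
# Hypothetical costs: tossing is cheap now but expensive to dig out later;
# sorting costs effort now but makes each retrieval cheap.
placements = {
    "tossed_in_bin":   {"place_cost": 1.0, "retrieve_cost": 5.0},
    "sorted_on_shelf": {"place_cost": 3.0, "retrieve_cost": 1.0},
}

def myopic_choice(options):
    """Minimize immediate effort only."""
    return min(options, key=lambda k: options[k]["place_cost"])

def farsighted_choice(options, expected_retrievals=2):
    """Minimize immediate effort plus expected future retrieval cost."""
    return min(options, key=lambda k: options[k]["place_cost"]
               + expected_retrievals * options[k]["retrieve_cost"])

print(myopic_choice(placements))      # tossed_in_bin
print(farsighted_choice(placements))  # sorted_on_shelf
```

As soon as future retrievals carry any weight, the neat placement wins, which is the "anticipate what comes next" point of the post.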



