Pedro Milcent @MilcentPedro

170 posts

🤖 Bringing robots to life with human data at @DeplaceAI. Opinions are my own.

Paris, France · Joined April 2025
847 Following · 225 Followers

Pinned Tweet
Pedro Milcent @MilcentPedro:
What if Physical AI could fully leverage human videos for model training? 💪➡️🦾 Human demos like the one below hold massive potential, but to unlock that power, we need to extract the relevant information they contain.
Pedro Milcent @MilcentPedro:
Great event today at the Bay Area Robotics Symposium (BARS) at @Stanford. Amazing to see how much has evolved just this year.
Pedro Milcent @MilcentPedro:
That’s what our Motion2Text API is for: turning human demos into fine-grained language descriptions that guide robotic models and make human data truly useful. Interested in trying it out? Get in touch: forms.gle/sSHTPBDVkBtnHh…
Pedro Milcent @MilcentPedro:
@aesposito__ @IlirAliu_ @DeplaceAI Also see a lot of value in Motion2Text for third-person videos, something we will focus on more later. Today, we're working on language to better leverage ego human demos for contact points, trajectory strategies, etc.
Andrea @aesposito__:
@IlirAliu_ @DeplaceAI motion to language to motion seems unnecessary for first-person videos. it might make more sense for third-person, at least until someone cracks video-to-simulation
Ilir Aliu @IlirAliu_:
You will soon be able to teach robots what humans are doing… using natural language. I spoke to one of the founders a couple of months ago on my podcast: an API that takes raw human videos and returns detailed language annotations describing the motion. @DeplaceAI is building something wild. Not just basic labels; it captures:
✅ motion semantics
✅ relative positions
✅ cause and effect
✅ task outcomes
✅ and more
It’s built on top of research in point tracking, segmentation, and learning from demos, all to make it easier to train robots and embodied agents without manual labeling. They’re offering early access and sharing some of the datasets they collected via their global network of video collectors. Thanks for sharing, @MilcentPedro! 🔗 forms.gle/sSHTPBDVkBtnHh… One of the most interesting motion-to-language interfaces I’ve seen. If you’re working in robotics, vision, or LfD, it’s worth checking out.
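The checklist above describes the annotation as structured data. As a rough illustration only (this is not Deplace AI's actual schema; every field name and example value here is invented), such an annotation record could be modeled like this:

```python
from dataclasses import dataclass


@dataclass
class MotionAnnotation:
    """Invented sketch of a motion-to-language annotation record."""
    motion_semantics: str        # what the hand or body is doing
    relative_positions: list     # spatial relations between objects
    cause_and_effect: str        # how the motion produces the outcome
    task_outcome: str            # whether and how the task succeeded


# Example annotation for a hypothetical pouring demo.
ann = MotionAnnotation(
    motion_semantics="right hand grasps the mug by its handle",
    relative_positions=["mug is to the left of the kettle"],
    cause_and_effect="tilting the kettle pours water into the mug",
    task_outcome="mug is filled with water",
)
```

A structured record like this is what would let downstream training code filter demos by outcome or condition a policy on the semantic description.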
Pedro Milcent @MilcentPedro:
@IlirAliu_ @DeplaceAI There’s a lot of info in human demos: object affordances, trajectory strategies, contact points, etc. We just have to make it useful for robotics.
Pedro Milcent @MilcentPedro:
✋➡️🤖 human intelligence to robotic intelligence through language
Quoting Ilir Aliu @IlirAliu_ (tweet above)
Pedro Milcent @MilcentPedro:
What if you could send human videos and get back detailed language descriptions of the motion to guide your Physical AI model? Happy to share more about the Motion2Text API we are building at @DeplaceAI! We are working on natural language descriptions that capture:
📙 Motion semantics
📏 Relative positions
⚙️ Cause and effect
💡 and more
Leveraging state-of-the-art research and models in point tracking, semantic generalization, object segmentation, and learning from human demos ✋➡️🤖 This helps robots understand human movements and closes the gap between humans and robots, so human videos can be fully leveraged to train Physical AI! 👉 If you want early access to the API or to use the datasets we gathered through our network of collectors, let us know; we would love your feedback: lnkd.in/eM5i4Zcc
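Consuming a video-in, descriptions-out API like the one described above might look like the sketch below. The response shape, field names, and example content are all assumptions for illustration, not Deplace AI's actual interface:

```python
import json

# Invented example of what a motion-to-text response might contain:
# timestamped segments, each with a language description of the motion.
SAMPLE_RESPONSE = json.dumps({
    "clip_id": "demo_0001",
    "segments": [
        {"start_s": 0.0, "end_s": 1.4,
         "description": "left hand opens the drawer by pulling the handle"},
        {"start_s": 1.4, "end_s": 3.0,
         "description": "right hand places the folded shirt inside the drawer"},
    ],
})


def to_training_captions(raw_json: str) -> list:
    """Flatten a response into (start, end, text) tuples for a training pipeline."""
    data = json.loads(raw_json)
    return [(seg["start_s"], seg["end_s"], seg["description"])
            for seg in data["segments"]]


captions = to_training_captions(SAMPLE_RESPONSE)
```

The point of the sketch is the shape of the data: per-segment language aligned to video time, which is what a policy-training pipeline would pair with frames or trajectories.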
Aaron Tan @aaronistan:
@MilcentPedro Thank you! Did you like the Parisian architecture in the video?
Aaron Tan @aaronistan:
Introducing Lume, the robotic lamp. The first robot designed to fit naturally into your home and help with chores, starting with laundry folding. If you’re looking for help and want to avoid the privacy and safety concerns of humanoids in your home, pre-order now.
Pedro Milcent @MilcentPedro:
Awesome few days in Paris: the @ycombinator event at the Sorbonne had great tips for early-stage founders, and the @RaiseSummit was all about the future of AI (Physical AI too!). More tomorrow 🦾🤖
NEURA Robotics @NEURARobotics:
What a reveal! 🦾 The new generation of 4NE1 is finally here. Ready to bring safe, intelligent automation to the real world. Thanks to everyone who was there live for this big moment at automatica 2025. More impressions coming soon. Stay tuned!
Stone Tao @Stone_Tao:
The rest of my lab has various presentations at RSS 2025, please check out their awesome work!
June 25, EEB 248 @ 10am: Hardware Optimization for In-Hand Rotation, by K. Fay
June 25, EEB 248 @ 3pm: Towards Embodiment Scaling Laws in Robot Locomotion, by @BoAi0110
June 25, OHE 122 @ 11am: ImVR: Immersive VR Teleoperation System for General Purpose, by Yulin Liu
Pedro Milcent @MilcentPedro:
Great paper by @mimicrobotics, showcasing the 16 DoF Faive hand in action and highlighting the importance of #real, diverse, and curated #data for Physical AI models that are performant, generalizable, and capable of self-correction 📈✋ 📄 Check out the paper by @elvisnavah and team here: arxiv.org/abs/2506.11916
Pedro Milcent @MilcentPedro:
@xiao_ted Would love to watch the session if it's recorded, big momentum in Physical AI data 🦾
Ted Xiao @xiao_ted:
Excited to attend #RSS2025 in LA this coming week! Lots of progress in the field on embracing data-driven robotics, I’m expecting a very different vibe this year. I’ll be giving a talk on Gemini Robotics and debating at the SemRob and RoboEval Workshops, hope to see you there!
Pedro Milcent @MilcentPedro:
@RoboticsSciSys Amazing to see so many of the top minds in robotics at the USC campus! 🦾🦾
Robotics: Science and Systems @RoboticsSciSys:
🌟 Day 2 at #RSS2025! 🌟 We just had a super inspiring Early Career Spotlight Talk by Dorsa Sadigh: "From Dirt to Data: How Gardening Taught Me About Generalist Robot Policies"! 🤖💡 Stay tuned for more excitement today!
Pedro Milcent @MilcentPedro:
@SnehalJauhri Thanks for sharing the recording! Both papers are extremely relevant to scaling data collection for robotics 👏👏
Snehal Jauhri @SnehalJauhri:
Thank you to all the speakers & attendees for making the EgoAct workshop a great success! Congratulations to the winners of the Best Paper Awards: EgoDex & DexWild! The full recording is available at: youtu.be/64yLApbBZ7I Some highlights:
Quoting Snehal Jauhri @SnehalJauhri:

Join us on Saturday, 21st June at EgoAct 🥽🤖: the 1st Workshop on Egocentric Perception & Action for Robot Learning @ RSS 2025 @RoboticsSciSys in Los Angeles! ☀️🌴 Full program w/ accepted contributions & talks at: egoact.github.io/rss2025 Online stream: tinyurl.com/egoact

Pedro Milcent @MilcentPedro:
@elvisnavah Awesome results, really shows how crucial data collection decisions are for model performance!
Elvis Nava @elvisnavah:
Happy to announce mimic-one: a Scalable Model Recipe for General Purpose Robot Dexterity, the culmination of years of research work in dexterous manipulation with imitation learning.