@AntonioSitongLi Tell me if I am right:
Does it, like, time how long the servo takes to get to its position, and then if it's not there in that time, calculate the grip force based on where it *is*?
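If that reading is right, the idea is stall detection: command a position, give the servo a deadline, and if it stalls short, treat the remaining position error as elastic deflection and map it to force. A minimal sketch of that logic — all names, the timeout, and the stiffness constant are made-up illustration values, not any real servo API:

```python
# Hypothetical grip-force estimate from a stalled servo, as described above.
STIFFNESS = 0.8   # N per degree of position error (made-up calibration value)
TIMEOUT_S = 0.5   # how long the servo gets to reach its target
DEADBAND = 1.0    # degrees of error we still count as "arrived"

def estimate_grip_force(target_deg, actual_deg, elapsed_s):
    """Return None while the servo is still moving, 0.0 if it reached
    the target (nothing gripped), or an approximate force if it stalled
    short of the target against an object."""
    if elapsed_s < TIMEOUT_S:
        return None                    # deadline not up yet, no conclusion
    error = abs(target_deg - actual_deg)
    if error < DEADBAND:
        return 0.0                     # target reached: empty close
    return STIFFNESS * error           # stalled: error is proportional to force
```

For example, a gripper commanded to 40° that stalls at 30° past the timeout would report `0.8 * 10 = 8.0` N under these made-up constants.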
Scientists have developed a framework that provides vision-language models with a library of information about object shapes and orientation, boosting spatial awareness for more precise robotic manipulation.
Learn more in Science #Robotics: scim.ag/4cN2r3m
🧵In the world of physical AI, we’re used to seeing videos sped up to hide how slow robots actually are.
@EkaRobotics is doing the exact opposite.
They’re showing their robots at 1/25x speed because the movement is so fast and the force sensitivity so precise that you’d miss the "superhuman" dexterity in real time.
First training run of our intern with the ACT policy. Okay-ish, but not fully working yet.
Specs:
- 100 samples
- 10k steps
- Batch size 16
- Chunk size 48 (50 Hz control loop)
- Action steps 30
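Collected into one place, those run settings look something like the sketch below. The key names are illustrative, not the exact fields of any particular ACT implementation; the only derived fact is that at 50 Hz a 48-step chunk spans just under one second of motion:

```python
# The training-run settings from the post, as an illustrative config dict.
act_config = {
    "num_episodes": 100,      # 100 demonstration samples
    "training_steps": 10_000,
    "batch_size": 16,
    "chunk_size": 48,         # actions predicted per forward pass
    "control_hz": 50,         # 50 Hz control loop
    "n_action_steps": 30,     # execute 30 of the 48 before re-predicting
}

# Horizon covered by one predicted chunk: 48 / 50 = 0.96 s
chunk_horizon_s = act_config["chunk_size"] / act_config["control_hz"]
```

Executing only 30 of the 48 predicted steps before re-predicting is the usual receding-horizon pattern: the policy re-plans before the old chunk runs out.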
Claude now connects to the tools creative professionals already use.
With the new Blender connector, you can debug a scene, build new tools, or batch-apply changes across every object, directly from Claude.
It's a 3D printer *and* a 3D assembly station! 🖨️
The Functgraph, developed at Meiji University, starts as a regular 3D printer but upgrades itself into a mini factory. It can print parts for its own tools, pick them up, clean them, and put them together, all by itself.
Think of it like a robot that can 3D print a spatula, assemble it, and then use it to flip pancakes. 🥞
Instead of just printing objects, the Functgraph can actually perform physical tasks, like folding laundry or slicing vegetables, by using printed and assembled tools.
It’s a step toward the idea of a robot that can download "apps" as physical skills, much like your phone downloads software! 👀
~~
♻️ Join the weekly robotics newsletter, and never miss any news → ziegler.substack.com