

Dimensional

@dimensionalos
The open-source framework for agentic robotics. Program atoms, not bits. https://t.co/TqYXsMYBDV




Fake bait. Language agents do not understand the physical world. You need something fundamentally different.





2/ We’ve spent months building the infra that lets agents communicate with any hardware over a custom, pip-installable transport layer. That’s how we already cover 80% of robots (Unitree, Deep, AgiBot, Galaxea, AgileX) and most drone platforms!
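A pip-installable transport layer like the one described above typically boils down to one uniform send/receive interface with vendor-specific backends behind a registry. Here is a minimal sketch of that pattern; the class names (`Transport`, `LoopbackTransport`) and the command dict shape are assumptions for illustration, not Dimensional's actual API.

```python
from abc import ABC, abstractmethod

class Transport(ABC):
    """Uniform interface an agent uses regardless of the hardware behind it."""
    @abstractmethod
    def send(self, command: dict) -> None: ...
    @abstractmethod
    def receive(self) -> dict: ...

class LoopbackTransport(Transport):
    """Stand-in backend: echoes commands back as state, for testing agents offline."""
    def __init__(self):
        self._last: dict = {}
    def send(self, command: dict) -> None:
        self._last = {"ack": True, **command}
    def receive(self) -> dict:
        return self._last

# In a real package, entries like "unitree" or "agilex" would map to
# vendor-specific backends speaking each robot's native protocol.
REGISTRY = {"loopback": LoopbackTransport}

t = REGISTRY["loopback"]()
t.send({"cmd": "walk", "vx": 0.3})
state = t.receive()
```

The point of the registry is that the agent code never changes when you swap hardware; only the backend class does.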





So @dimensionalos did a thing. They sponsored HackNation, a hackathon with the brightest nerds, and man did those guys set a crazy quest.

The problem we tackled? You have 300 humanoids in your factory and an EEG headset on your head. Ideally, you’d control them all with your thoughts, because one controller per robot just doesn’t scale. Literally imagine clenching your right fist to turn right, both fists to walk, tapping your tongue to shift gears. Yes, you read that right: pure thought to robot movement.

That’s Kinexus: a dashboard that visualizes your brain signals in real time, converts them into humanoid movements, and maps the entire factory environment. You can also just say “pick the box from the conveyor and place it on Pallet 2” and the humanoid autonomously navigates, grabs, walks, and releases.

AGI, we’re coming for you. GitHub repo + cool pics in the first comment.
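The gesture-to-motion scheme above (right fist → turn, both fists → walk, tongue tap → shift gears) can be sketched as a simple lookup once a classifier has labeled the EEG signal. Everything here is an assumption for illustration: the label strings, the command names, and the `decode` helper are hypothetical, and a real system would first classify raw EEG windows before this step.

```python
# Hypothetical mapping from classified EEG gestures to humanoid commands.
GESTURE_TO_COMMAND = {
    "clench_right_fist": "turn_right",
    "clench_both_fists": "walk_forward",
    "tap_tongue": "shift_gear",
}

def decode(gesture: str) -> str:
    """Map a classified EEG gesture label to a command; idle when unrecognized."""
    return GESTURE_TO_COMMAND.get(gesture, "idle")

decode("clench_both_fists")  # "walk_forward"
```

Falling back to "idle" on an unrecognized label is the safe default when you are driving 300 physical robots from noisy brain signals.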
