OpenMind

990 posts


@openmind_agi

Superintelligence for robots.

San Francisco, CA · Joined July 2024
89 Following · 159.4K Followers
OpenMind @openmind_agi
Our Unitree Go2s are getting better at navigating real-world clutter. In this demo, the robot self-detours out of corners and avoids obstacles using tuned MPPI parameters. Fully autonomous navigation means moving without bumping into potentially critical objects, which is especially important in workplaces and homes.
6 replies · 29 reposts · 196 likes · 6.1K views
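For context on the "adjusted MPPI parameters" mentioned above: MPPI (Model Predictive Path Integral control) samples many noisy control rollouts, scores each against a cost function, and executes a softmax-weighted average. The knobs being tuned are typically the rollout count, horizon, temperature, noise scale, and obstacle-cost weight. The sketch below is a minimal, hypothetical illustration for a 2-D point robot, not OpenMind's actual controller; every name and value is illustrative.

```python
import numpy as np

def mppi_step(x, goal, obstacle, K=256, T=20, dt=0.1, lam=1.0, sigma=0.5, seed=0):
    """One MPPI step for a 2-D point robot (state = position, control = velocity).

    Tunable parameters: K (rollouts), T (horizon), lam (temperature; lower
    is greedier), sigma (control noise), and the obstacle penalty below.
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, sigma, size=(K, T, 2))   # K sampled control sequences
    pos = np.tile(x, (K, 1))
    costs = np.zeros(K)
    for t in range(T):
        pos = pos + noise[:, t] * dt                 # roll the dynamics forward
        costs += np.sum((pos - goal) ** 2, axis=1)   # stay close to the goal
        costs += 1000.0 * (np.linalg.norm(pos - obstacle, axis=1) < 0.5)  # keep clear
    w = np.exp(-(costs - costs.min()) / lam)         # softmax weights over rollouts
    w /= w.sum()
    return w @ noise[:, 0]                           # weighted first control
```

Raising the obstacle penalty or lowering `lam` makes the weighted average favor detouring rollouts, which is the kind of adjustment the demo describes.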
OpenMind @openmind_agi
Our build night was a huge success! Thank you to all the technical teams for showing up and launching on OM1. Winners received exclusive OpenMind backpacks and showcased their amazing demos, including:
- A robot that mirrors human motion in real time
- A new robot form factor abstracted and automated in just 30 minutes
- Autonomous drones controlled in simulation by natural language
It will only get easier to build applications from here.
[4 images]
Quoted: OpenMind @openmind_agi

OpenMind OM1 Build Night w/ @OpenAI Codex
Location: San Francisco [Address given upon successful RSVP]
Date: Wednesday, May 6, 4:30 PM – 9:00 PM
This event is for robotics and agent developers, AI-native builders, technical founders, and curious engineers who want a practical way to learn OM1 by actually building with it.
Bring a laptop and come ready to ship something.
Register: luma.com/openmind-om1-r…

12 replies · 50 reposts · 380 likes · 23.4K views
OpenMind @openmind_agi
To all roboticists building VLAs: please stop what you are doing, take a break, get a coffee, and read the LeWorldModel paper by @lucasmaes_, @randall_balestr, @ylecun and collaborators: arxiv.org/abs/2603.19312. Then read it several more times. The same general approach can be mapped directly to other key problems in robotics, including handling multimodal inputs such as vision and speech.
27 replies · 27 reposts · 234 likes · 10.1K views
OpenMind @openmind_agi
Unitree Go2* robot dog
1 reply · 0 reposts · 12 likes · 2.2K views
OpenMind @openmind_agi
Bots, Bevs & Devs was a blast! Huge thanks to everyone who came out to Circuit Launch in Oakland for an evening of robotics, AI demos, short talks, and great conversations. OpenMind had the chance to demo our Unitree B2 robot dog running OM1, and it was awesome seeing people interact with the future of embodied intelligence up close.
[3 images]
7 replies · 39 reposts · 335 likes · 11.7K views
OpenMind @openmind_agi
Thank you to @itsajchan and HackerSquad for organizing this event alongside us!
2 replies · 0 reposts · 45 likes · 3.5K views
OpenMind @openmind_agi
OpenMind OM1 Build Night w/ @OpenAI Codex
Location: San Francisco [Address given upon successful RSVP]
Date: Wednesday, May 6, 4:30 PM – 9:00 PM
This event is for robotics and agent developers, AI-native builders, technical founders, and curious engineers who want a practical way to learn OM1 by actually building with it.
Bring a laptop and come ready to ship something.
Register: luma.com/openmind-om1-r…
[1 image]
8 replies · 46 reposts · 356 likes · 31.9K views
OpenMind @openmind_agi
Here's the recap of our Robotics Intelligence Seminar at Stanford Research Institute! We had a packed room with amazing speakers, panelists, and robotics enthusiasts. See you at our next event where we'll continue to bring industry thought-leaders together.
52 replies · 43 reposts · 305 likes · 16.9K views
OpenMind @openmind_agi
We were delighted to participate in CONNECT 2026: Global Embodied AI Innovation Summit, where we formalized our strategic partnership with @MagicLab_Robot. We also shared our thoughts on the panel "Perception, Foundation Models & Decision-Making." Excited to see continued collaboration in the space as we expand our reach across multiple robot manufacturers.
[2 images]
18 replies · 32 reposts · 308 likes · 10.8K views
OpenMind @openmind_agi
Why Generic Humanoid Robots Will Fail — And What's Next

Imagine an alternate world where we never invented the car. In that world, a robotics engineer might reasonably conclude that robotic horses are the future: replace the living ones, keep the stables and saddles, ride them to work. Convenient, modern, and the roads stay free of manure. It sounds absurd only because you already know about cars. We keep making the same mistake with humanoid robots.

Consider transportation. To finally make driving safe, we had two options: put a humanoid in the driver's seat, or embed sensing and compute directly into the vehicle. Waymo chose the latter. It has no steering wheel. It exists purely to move people efficiently from A to B. The humanoid was not needed.

Consider a sock factory. Yes, you could replace workers with humanoid robots one-for-one on the assembly line and gain maybe 2-3x efficiency. Or you could completely redesign the workflow around a purpose-built autonomous sewing system and eliminate most of the factory: the chairs, the cafeteria, the manual sewing machines, the HVAC, the doors, and the restrooms. The real optimization is to sidestep the previous human-imposed physical constraint.

Look at Ukraine. The front lines aren't filling up with Terminator-style humanoids carrying rifles. Human soldiers are being replaced by heterogeneous swarms of purpose-specific drones: some for reconnaissance, some for logistics, some for delivering munitions. War is being restructured around the desired outcome (survival), not the soldier's shape.

Consider a 1970s office. Want to move information through teams of people? We once used typists, paper, trucks to supply the paper, typewriters, and repair technicians. A linear improvement would have been to replace the human typist with a 10-fingered humanoid. What actually happened? The entire workflow — paper, printers, typewriter factories, delivery trucks, desks, offices — was obliterated. Email deleted the human clerk's entire universe.

Consider early cancer detection by mammography. Today, getting a mammogram requires expensive hardware, logistics infrastructure, human nurses and doctors, a biopsy workflow, a human pathologist with a microscope (imported from Germany or Japan), a written finding, and multiple physician reviews. Sure, you could replace the pathologist with a humanoid (the microscope focus knob requires finger dexterity) and get a modest efficiency gain (and faster responses at 2 am). Or, in the far more likely future, we all swallow a cancer-detection pill every few months, and 24 hours later a color-changing sticker on our arm turns red or green. No hardware. No hospital. No logistics. No pathologist. No office. No desk. No humanoid. The workflow isn't optimized by a literal drop-in swap of a human pathologist for a humanoid. The entire workflow simply ceases to exist.

Consider life-sciences research and drug development. We're seeing excitement about robot arms and humanoids pipetting water in research labs. Robot horses, episode 7. We don't design aircraft by crashing test planes; we simulate them entirely in software first. Biology will go the same way. The path to scalable drug discovery isn't robot arms in conventional wet labs demonstrating 10-fingered prowess at manipulating Eppendorf tubes filled with purple food coloring. Rather, we need in-silico biological models that evaluate billions of hypotheses computationally, with physical manipulation of atoms only at the very end.

The clear pattern: efficient automation doesn't try to replicate a 10-fingered human in a static context. Automation eliminates physical rate-limiting steps in their entirety. That's why "classical" humanoid robots, as a generic category, will largely fail. They're robotic horses. They assume the infrastructure and workflows stay fixed and only the 10-fingered human is swapped out. That's not how economic and technological pressure works.

What actually matters? If humans continue to inhabit the physical world, then moving atoms will remain important, and that requires five things: atoms, energy, force generation and actuation, sensing, and compute. Everything else (form factor, number of limbs, type of end effector) is a variable to be optimized for the task.

So if you are a pathologist, a robotics engineer, a teacher, a parent, a politician, or a sewing-factory owner: please think different. Most obviously, we should all anticipate, and build for, a future in which robots exhibit extreme physical fluidity: two arms or four; wheels or legs; tentacles or flippers; three fingers or twelve, or none at all; eyes at the front, side, or tip of a tentacle.

At OpenMind, we don't care what you look like right now. We've got you, in all your physical form factors. OM2 ships in July, for all machines. Let's build.
16 replies · 44 reposts · 252 likes · 1.3M views
Glen Gilmore | #AIWeek26 🇮🇹
Teaching robots to read human movement and intent in real time. Critical for safer human-humanoid interaction at scale. Opening new frontiers for AI governance and privacy. 🎥 @openmind_agi
4 replies · 6 reposts · 32 likes · 2.3K views
OpenMind @openmind_agi
From gesture to intent: our latest work shows how advanced keypoint detection (body + hands) can unlock powerful, real-time action recognition. We're pushing the boundaries with both data-driven models and zero-shot approaches, scaling from a handful of core actions to a richer set of human behaviors without always needing new training data. This is a glimpse into more adaptive, intelligent systems that understand people naturally, which is vital for mass robot adoption.
21 replies · 34 reposts · 310 likes · 12.9K views
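To make the zero-shot idea above concrete: one common pattern is to reduce a keypoint trajectory to a motion descriptor and match it against labelled prototype descriptors, so supporting a new action only requires adding a prototype rather than retraining. The sketch below is hypothetical and far simpler than a production pose pipeline; the descriptor (net joint displacement) and the prototype dictionary are purely illustrative.

```python
import numpy as np

def action_from_keypoints(traj, prototypes):
    """Zero-shot action matching from a keypoint trajectory.

    traj: (T, J, 2) array of J 2-D keypoints over T frames (e.g. from a
    pose estimator). The net per-joint displacement serves as a crude
    motion descriptor, matched to labelled prototypes by cosine similarity.
    """
    desc = (traj[-1] - traj[0]).ravel()          # net joint displacement
    desc = desc / (np.linalg.norm(desc) + 1e-9)  # unit-normalize
    best, best_sim = None, -np.inf
    for label, proto in prototypes.items():
        p = proto / (np.linalg.norm(proto) + 1e-9)
        sim = float(desc @ p)                    # cosine similarity
        if sim > best_sim:
            best, best_sim = label, sim
    return best, best_sim
```

Adding "a richer set of human behaviors" then amounts to inserting new entries into `prototypes`; real systems would use learned embeddings rather than raw displacements, but the matching logic is the same shape.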
OpenMind @openmind_agi
Demonstrating our latest localization system in action. At bootup, the robot has no prior position estimate, but within seconds it autonomously localizes itself using a fusion of three algorithms. Unlike our previous version, this updated system incorporates vision, allowing the robot to adapt to real-world changes (e.g., moved furniture) by recognizing previously seen environments. In this demo, we repeatedly reset navigation and reposition the robot to random locations, showing robust, repeatable localization. The robot then executes a full patrol, following a planned path (visualized in RViz) with real-time path tracking. Because our software is hardware-agnostic, it brings the same reliable performance to any robot it runs on.
Quoted: OpenMind @openmind_agi

Our CTO @boyuan is demoing our new localization algorithm, which surpasses leading industry solutions. Localization enables a robot to determine its position in an environment, which is essential for navigation. While most systems require robots to start from a predefined location, ours doesn’t. It lets robots boot up anywhere and immediately locate themselves, making deployment far more flexible for real world scenarios.

19 replies · 30 reposts · 223 likes · 12.3K views
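The tweets above don't specify the three algorithms or how they're fused. A standard building block for this kind of multi-source localization is inverse-variance weighting, where more confident sources get more say in the combined estimate. The 1-D sketch below is purely illustrative of that principle; a real system would fuse full poses with covariance matrices (e.g. via a Kalman update), and the named sources are assumptions, not OpenMind's actual stack.

```python
def fuse_estimates(estimates):
    """Inverse-variance fusion of independent 1-D position estimates.

    estimates: list of (position, variance) pairs, a stand-in for sources
    like lidar scan matching, wheel odometry, and a visual place-recognition
    fix. Lower-variance (more confident) sources dominate the result, and
    the fused variance is always smaller than any single source's.
    """
    precision = sum(1.0 / v for _, v in estimates)        # total confidence
    fused = sum(p / v for p, v in estimates) / precision  # precision-weighted mean
    return fused, 1.0 / precision
```

For example, `fuse_estimates([(1.0, 1.0), (3.0, 1.0)])` returns `(2.0, 0.5)`: two equally confident sources average out, and the fused variance shrinks below either input's.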
OpenMind @openmind_agi
4/ Speaker Release
Keynote Speech: Introducing the International Humanoid Robotics Standardization Consortium
Speaker: Brian Koo from @LiveX_ai
[1 image]
1 reply · 2 reposts · 22 likes · 2.5K views
OpenMind @openmind_agi
3/ Speaker Release
Where Robots Deliver Real Value
Esteemed Speakers:
Steve Cousins from The Stanford Robotics Center
Grace Brown @Grace_JBrown from Andromeda Robotics
Gloria Tzou, Health & Tech, formerly AWS, Computer Vision at Columbia
[1 image]
1 reply · 3 reposts · 37 likes · 3K views