
AUKI Community Update Feb 14 x.com/i/broadcasts/1…
@dyor21


If you didn't catch Robotics Livestream Ep. 2, here's what the panel actually disagreed on - and where they landed.

The humanoid question split the room. @broodsugar from @Auki was the most bearish: "I am yet to find a real customer that actually wants to buy or rent a robot with legs." He compared Figure to Magic Leap: $500M raised on a vision that took a decade to even partially materialize. @dabblerer_ from @BitRobotNetwork pushed back gently: humanoids are the most attention-grabbing form factor and the most relatable for people. Specialized robots might ship first, but generalized robots could still find product-market fit if the development catches up. George from @tashiprotocol sided with Nils: "Deploy 10 single-use robots that all talk to each other instead of one humanoid who gets his ass kicked the moment it leaves the lab."

On cloud vs. edge, the panel converged fast. Nils was absolute: cloud-controlled robots will never work in production. Customer sites have bad internet, and millisecond latency matters for safety. @0xPravar from @nunet_global agreed but added nuance: it won't be pure edge either. The answer is a hybrid architecture with a compute layer that doesn't exist yet. Nils coined a term for it: "domain-side compute" - hyper-local compute on the same network as the robot, plugged into the wall. He believes connecting to this will become as common as connecting to wifi.

On data, the panel agreed the industry is still early. Nvidia's Sonic model used 700 hours of motion capture; Nils called it less than $100K worth of data. Karen noted it was mostly collected in Nvidia's controlled offices; the real challenge is real-world environments. But both agreed open-sourcing is the right move, because it gives teams a foundation to build on rather than reinventing the wheel.

On the biggest opportunity nobody's talking about: Nils laid out a thesis that the internet itself needs to grow three new dimensions for physical AI - sensors, spaces, and actuators. George connected it to traffic: autonomous vehicles coordinating at low latency in Beijing would recover the equivalent of pyramid-building time every week.

The consensus: the infrastructure layer for physical AI is where the real value is. Hardware will take care of itself. The coordination, compute, and data layers are the bottleneck.

Catch the full Robotics Livestream Ep. 2 on YouTube: youtu.be/jUBUnXiaHjU
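Nils's "less than $100K worth of data" line is simple arithmetic. A back-of-envelope sketch, assuming an illustrative all-in motion-capture rate (the per-hour cost is my assumption, not a figure from the panel):

```python
# Back-of-envelope check on the 700-hour / "$100K of data" claim.
# USD_PER_HOUR is an assumed illustrative rate, not a number from the panel.
MOCAP_HOURS = 700        # hours of motion capture cited for Nvidia's Sonic model
USD_PER_HOUR = 140       # assumed all-in cost per mocap hour (studio, operators, cleanup)

total_usd = MOCAP_HOURS * USD_PER_HOUR
print(f"~${total_usd:,}")  # ~$98,000, i.e. under the $100K Nils cited
```

At any plausible rate in that ballpark, the entire training set costs less than one engineer-year, which is the point: data is not yet the expensive part.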





App-free navigation at a massive electronics fair in Hong Kong by @Auki and @ZapparApp




"We are missing a word for a category of compute that is not quite edge and not quite cloud."- @broodsugar, CEO of @Auki Nils calls it "domain-side compute", hyper-local compute resources, maybe on the same network as you. Not the cloud in the AWS-us-east-1 sense. Something in between: plugged into the wall, close to you, and much smaller than a data center. The math forces this: a Unitree G1 humanoid gets about two hours of battery. Meta's latest internal AR glasses that do full SLAM get about 30 minutes and still weigh twice what a human will tolerate wearing all day. The compute has to move off the device, but it can't move to the cloud because you need sub-8 millisecond latency for AR and sub-millisecond responsiveness for robots interacting with the physical world. @nunet_global validated this with live deployment: a European real estate company is already running AI agents for energy optimization on exactly this kind of domain-side compute setup, a combination of small on-site devices and a nearby local data center. Three forces are driving this: latency requirements that cloud can't meet, data transfer costs that make cloud uneconomical for sensor-heavy systems, and privacy requirements that make customers unwilling to send data to a remote server. Robotic Livestream EP2, watch it on YouTube: youtu.be/jUBUnXiaHjU

- Nils speaking at Harvard XR today: harvardxr.com/2026/program-2…
- Announced the exocortex: a harness for our AI agents that maintains shared context and memory over time across Auki, allowing us to increase our intercognitive capacity. To be open-sourced for external contributors.

Auki Community Update Apr 10 x.com/i/broadcasts/1…




