Antioch
@antiochrobotics
22 posts

The simulation platform for physical autonomy.
New York, USA · Joined December 2025
13 Following · 97 Followers
Antioch @antiochrobotics
Antioch is bringing agentic development to physical autonomy. The next wave of AI will transform physical industry (manufacturing, logistics, construction) the way LLMs have changed knowledge work. To unlock this shift, every autonomy team needs the same closed-loop, agentic development infrastructure that tools like Cursor have brought to software.

Last week, we announced Antioch’s $8.5M seed raise to build this. Read about the full vision, the team behind it, and what’s next: antioch.com/blog/seed/
Antioch @antiochrobotics
We’ve raised $8.5M in seed funding, valuing Antioch at $60M, to build the simulation platform for physical autonomy. The raise was led by @Category_VC and @A_StarVC, with participation from @MaCVentureCap, @AbstractVC, @BoxGroup, @IcehouseVenture, and angels.

Antioch brings robotics and autonomy development entirely into simulation. Teams build, iterate, and test their full autonomous stacks in a single platform, accelerated by agents with native understanding of 3D environments. We're doing for physical autonomy what coding agents have done for software development: when the entire development loop lives in simulation, engineers and agents can reason about a robotic stack the same way they reason about a codebase. Run it, observe the result, and iterate.

The funding will go toward further expanding the engineering team, the core simulation infrastructure, and our new agentic framework. TechCrunch covered the raise and what we’re building: techcrunch.com/2026/04/16/thi…

If you’re an exceptional engineer interested in redefining autonomy development, we’d love to chat: antioch.com/careers.
Antioch reposted
Harry Mellsop @HarryMellsop
anyone know if this is grounds for a complaint to @peta ?
Antioch @antiochrobotics
If you're iterating on widely deployed autonomous systems, let's talk: antioch.com
Antioch @antiochrobotics
The obvious case for simulation is developing a new system faster. The case that gets less attention, and that we've come to appreciate deeply working with teams shipping at scale, is simulation against autonomous systems that are already widely deployed.

Once a system runs at scale, the calculus of change inverts. Retuning a control loop deep in the stack sounds minor, but it has to be validated across the full range of conditions the product has met in the field. That cost is often prohibitive, so the change doesn't ship.

This loss doesn't show up on any dashboard. A 5% detection lift no one can validate, or a firmware tuning pass that would benefit the entire installed base, sits on the shelf. The real cost of leaving a working system alone is the opportunity cost of improvements that never get made.

Running the full stack in sim changes what's possible. When the actual production software runs on top of a high-fidelity hardware model, a change to a deployed system surfaces its regressions in sim the same way it would in the field. We’re building Antioch for this: production code runs in sim the same way it runs on the hardware, so a change validated in sim carries over to the field.
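The validation loop described above can be sketched as a tiny regression harness. This is an illustrative toy, not Antioch's API: `run_stack`, the scenario list, and the gain parameters are hypothetical stand-ins for replaying production software against recorded field conditions in sim.

```python
def run_stack(detection_gain, scenario):
    # Hypothetical stand-in for running the full autonomy stack in sim;
    # returns a detection score for one recorded field condition.
    return min(1.0, detection_gain * (1.0 - scenario["difficulty"]))

def regression_suite(scenarios, baseline_gain, candidate_gain):
    # Compare a candidate change against the deployed baseline, scenario
    # by scenario, and report every case where the candidate got worse.
    regressions = []
    for s in scenarios:
        before = run_stack(baseline_gain, s)
        after = run_stack(candidate_gain, s)
        if after < before:
            regressions.append((s["name"], before, after))
    return regressions

# Toy stand-in for the "full range of conditions the product has met".
scenarios = [
    {"name": "clear_day", "difficulty": 0.2},
    {"name": "heavy_rain", "difficulty": 0.7},
    {"name": "night_fog", "difficulty": 0.9},
]

# A 5% detection lift checked against every recorded condition before it ships.
print(regression_suite(scenarios, baseline_gain=1.0, candidate_gain=1.05))
```

The point of the sketch: once the stack runs in sim, "validate across the full range of field conditions" becomes a loop over scenarios rather than a field campaign.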
Antioch @antiochrobotics
Most teams pick their simulation engine carefully and then run whatever sensor model ships with it. There's a lot of other infrastructure to stand up, so sensor fidelity can get deferred. But cameras, LiDAR, infrared, and radar sit upstream of everything else in the sim stack. When they're not modeled accurately, perception, planning, and control all train and test against inputs that don't match what the hardware actually sees in the field.

The difficulty is that real sensors don't behave like their spec sheets. They have noise profiles, saturation curves, beam patterns, and failure modes that vary by unit, by temperature, by firmware version. A simulated LiDAR returning clean point clouds at uniform density bears little resemblance to the sparse, noisy returns your hardware produces at range. Most physics engines don't ship sensor models that account for any of this, and building them from scratch is a months-long detour from the product work teams are actually trying to do.

That's why Antioch invests so heavily in sensor fidelity. We’ve built up a robust library of sensor models and a calibration pipeline that can match specific physical hardware quickly and reliably, so teams can skip the detour and trust their sim results from day one. We’ve hardened these sensor models across a breadth of hardware and operating environments that no single autonomy team would test against. When we close the sim-to-real gap in one context, the model improves for everyone using it.

We build the simulation platform. Sensor modeling is one layer of it, but it's the layer that determines whether everything else is worth trusting.
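The gap between a spec-sheet sensor and a fielded one can be sketched in a few lines. Every parameter below (noise floor, range-dependent noise growth, dropout rate, saturation range) is an illustrative number, not a measurement from any real unit or a model from Antioch's library:

```python
import random

def ideal_lidar(true_ranges):
    # Spec-sheet model: every beam returns its exact range.
    return list(true_ranges)

def noisy_lidar(true_ranges, sigma0=0.02, sigma_per_m=0.002,
                max_range=100.0, dropout_per_m=0.006, seed=0):
    # Sketch of a hardware-informed model with made-up parameters:
    # range noise grows with distance, far returns drop out more often,
    # and everything saturates at the sensor's maximum range.
    rng = random.Random(seed)
    returns = []
    for r in true_ranges:
        if rng.random() < min(1.0, dropout_per_m * r):
            returns.append(None)  # lost return: no detection on this beam
            continue
        noisy = r + rng.gauss(0.0, sigma0 + sigma_per_m * r)
        returns.append(min(noisy, max_range))  # saturation clamp
    return returns

beams = [5.0] * 50 + [80.0] * 50
near = noisy_lidar(beams)[:50]
far = noisy_lidar(beams)[50:]
# Far returns come back sparser: dropout probability here is
# 0.006 * 80 = 0.48 at 80 m versus 0.006 * 5 = 0.03 at 5 m.
print(sum(v is None for v in near), sum(v is None for v in far))
```

Perception trained against `ideal_lidar` never sees the sparse, noisy far field that `noisy_lidar` produces, which is the mismatch the tweet describes.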
Antioch @antiochrobotics
ABB claims 99% correlation between simulation and real-world robot behavior, in large part because their virtual controller runs the same firmware as the physical robot. That architectural choice matters: if a simulated robot executes the same control code as the physical one, the sim-to-real gap narrows to physics modeling and sensor fidelity rather than software behavior divergence.

This is the same principle we build on at Antioch. ABB does it for industrial robot arms by running identical controller firmware on both sides. We do it for full autonomous stacks: the actual production software (perception, planning, controls) runs inside the simulation unchanged, not a simplified stand-in. The sim-to-real gap shrinks when you stop approximating the software and start testing the real thing.

Read the ABB/NVIDIA announcement here: blogs.nvidia.com/blog/abb-robot…
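The principle (run the production control code against a plant model rather than re-implementing it for sim) can be illustrated with a toy proportional controller. Everything here is a made-up minimal example, not ABB's or Antioch's software:

```python
def controller(setpoint, measurement, kp=0.8):
    # "Production" control law: the identical function runs on the
    # physical robot and inside the simulator, so its behavior can't
    # diverge between the two. Only the plant below is a model.
    return kp * (setpoint - measurement)

class SimPlant:
    # Simplified first-order plant standing in for the physics and
    # hardware model; this side is what sensor/physics fidelity work
    # improves, while the control code stays untouched.
    def __init__(self, state=0.0, dt=0.1):
        self.state = state
        self.dt = dt

    def step(self, command):
        self.state += command * self.dt
        return self.state

def run_episode(plant, setpoint=1.0, steps=200):
    y = plant.state
    for _ in range(steps):
        y = plant.step(controller(setpoint, y))
    return y

# The closed loop converges to the setpoint in sim; because the
# controller is the same code on both sides, any remaining sim-to-real
# error comes from the plant model, not a re-implemented control law.
print(run_episode(SimPlant()))
```

Swapping `SimPlant` for real hardware changes nothing in `controller`, which is the architectural point the tweet makes.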
Antioch @antiochrobotics
The simulation stack for physical autonomy is moving faster than most teams can keep up with. New physics engines, generative world models, and synthetic data tools are shipping every month. Wiring each new tool into a sim workflow is hard enough; keeping your stack current as the ecosystem shifts is harder still.

Software had this problem when AI models started proliferating faster than any team could integrate them. Cursor solved it by becoming the platform layer that absorbed the pace of model innovation and made it immediately usable to engineers.

Physical autonomy needs the same thing. The simulation primitives exist; what's missing is the platform that composes them into scalable testing infrastructure and keeps up with the ecosystem so individual teams don't have to. That's what Antioch is building: teams onboard their robot once, and as the state of the art evolves, their simulation infrastructure evolves with it.
Antioch @antiochrobotics
Sneak peek of the Antioch simulation platform, featuring Spot. Create a digital twin of your robot, stream telemetry, and run thousands of scenarios testing its behavior, entirely from your browser. We’re giving every team access to state-of-the-art sim, no GPU config required. If you’re building in the autonomy space, let’s chat: antioch.com/#contact
Antioch reposted
Harry Mellsop @HarryMellsop
China's structural advantage in autonomy isn't talent or capital. It's manufacturing scale. More factories. More real-world data. More test volume. To be competitive in the west, we need to lean on our key strength: software. Simulation, world models, and synthetic data are our advantage, and our answer. The problem: harnessing these tools for robotics is genuinely hard. We built the platform to make that answer actually work. 🧵
Antioch reposted
Michael Calvey @michaeljcalvey
One surprising learning from bringing simulation-driven validation to autonomy teams: there's a LOT of testing companies and engineers wish they could do but can't today.

A high cost of mistakes + low release confidence = slower launch cadence and weaker launches. You're a lot less likely to ship feature improvements when there's a real risk of bricking ten million customer units.

Testing in simulation totally flips this dynamic. The feedback loop gets way shorter, and teams can suddenly test a wide variety of regression scenarios and edge cases in parallel. More release confidence = faster dev cycles, confident tradeoff decisions, and ultimately better sleep at night.
Antioch @antiochrobotics
Simulation should be as foundational to physical autonomy as CI/CD is to software, but the barrier to entry is absurdly high. We’re writing about how to fix that. We launched a blog to discuss simulation infrastructure, platform engineering, robotics software, and how real teams put sim into practice. First post is live. antioch.com/blog/hello-wor…
Antioch @antiochrobotics
Real-world testing as the primary feedback loop doesn’t scale. Antioch is building the simulation infrastructure to replace it. If your team is shipping autonomous systems, let’s talk. antioch.com
Antioch @antiochrobotics
What’s missing is the development and testing infrastructure. Most teams are still only testing in the field. Multi-week turnarounds, high costs, and missed edge cases present a massive bottleneck in the development cycle.
Antioch @antiochrobotics
We’re at the beginning of a physical AI gold rush. Last week, NVIDIA GTC made it clear: we have the compute, model capability, demand, and financing for scaled deployment of autonomous systems.