EIDON AI
@eidon_ai

573 posts

Decentralized AI | Frontier Robotics | Next-gen Embodied AI with real-world data

(0,0,0,-ct) · Joined March 2024
206 Following · 3.4K Followers
Pinned Tweet
EIDON AI @eidon_ai ·
Decentralised Robotics 🧵 I/ Building datasets for embodied AI is tough: humanoid robots need real-world human motion task data, but collecting it at scale has been limited to research-lab projects or closed-source big labs. At Eidon, we started with our wearable IMU trackers. Here's what we've built and achieved so far.
11 replies · 52 reposts · 319 likes · 2.5M views
EIDON AI @eidon_ai ·
🌅🤖 JUST PASSED 1,000 HRS OF AI ROBOTICS DATA COLLECTION (synchronized arm tracking + egocentric video). We continue to scale toward 100k+ hrs and to include our GLOVE tech for dexterity. The full sashimi SET: glove + arm trackers + egocentric video + Sym (simulation env). 🍣 2026 is the peak of the AI robotics data-collection race. DM for sample data, and godspeed to 1M hrs.
[2 images attached]
14 replies · 12 reposts · 28 likes · 1.2K views
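The "synchronized arm tracking + egocentric video" pairing above comes down to aligning two streams captured at different rates (e.g. ~100 Hz IMU vs. 30 fps video). A minimal sketch of nearest-timestamp alignment — all names are illustrative, not Eidon's actual pipeline:

```python
# Hypothetical sketch: aligning wearable IMU samples to egocentric
# video frames by nearest timestamp (illustrative, not Eidon's API).
import bisect

def align_imu_to_frames(frame_ts, imu_ts, imu_samples):
    """For each video frame timestamp, pick the IMU sample whose
    timestamp is closest. Both timestamp lists must be sorted."""
    aligned = []
    for t in frame_ts:
        i = bisect.bisect_left(imu_ts, t)
        # Compare the neighbors on either side of the insertion point.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(imu_ts)]
        best = min(candidates, key=lambda j: abs(imu_ts[j] - t))
        aligned.append(imu_samples[best])
    return aligned

# 30 fps video vs. 100 Hz IMU, timestamps in milliseconds
frames = [0, 33, 66]
imu_t = [0, 10, 20, 30, 40, 50, 60, 70]
imu_s = ["s0", "s1", "s2", "s3", "s4", "s5", "s6", "s7"]
print(align_imu_to_frames(frames, imu_t, imu_s))  # ['s0', 's3', 's7']
```

Production pipelines typically also interpolate between the two nearest samples rather than snapping, but the nearest-neighbor version shows the core idea.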
EIDON AI @eidon_ai ·
the trenches in AI robotics data collection in 2025-26
[image attached]
17 replies · 8 reposts · 34 likes · 974 views
Adrian Macneil — 🤖/acc @adrianmacneil ·
🚀 We just raised $40 million to build infrastructure for Physical AI! 🦾

AI is rapidly transforming critical industries like manufacturing, logistics, transportation, agriculture, construction, aerospace, and defense. Teams that win in the physical world are those who can create a data flywheel, leveraging infrastructure to capture, ingest, analyze, and evaluate the vast quantities of data generated by real-world systems.

Robotics data is multimodal, time-synchronized, and bandwidth-constrained at the edge. Traditional data and observability platforms were only designed to store and query text and time-series data, not petabyte-scale 3D, video, audio, GNSS, and proprioceptive data. The ability to efficiently capture, ingest, search, visualize, and evaluate multimodal data is critical to Physical AI development.

Foxglove is a modern data engine for Physical AI, enabling you to record logs or capture demonstrations at the edge, sync recordings to the cloud or on-premises storage, find critical events across petabytes of data, evaluate robot performance, and watch a 3D frame-by-frame replay using our advanced visualization tool.

👉 Today is still Day 1 for Physical AI, and we're hiring for dozens of roles to assemble the best team in the industry. If you've built ML platforms, data infrastructure, dataset curation, evaluation and validation, or visualization tools at a leading robotics or autonomous vehicle company, let's chat – drop me a note or tag a friend below and I'll follow up personally!

Thank you to @AlexandraSukin and @jeremyl at @BessemerVP, @Sethwinterroth at @EclipseVentures, @dbeyer123 and @dhaliwas at @AmplifyPartners, and @IcehouseVenture for joining us on this mission. Also a special shoutout to our angels @tobi @alexgkendall @kvogt @_milankovac_ @hmehanna @pabbeel @BradPorter_ @bsofman @kevinmpeterson1 @ChrisWalti @Lindon_Gao @danielkan @AdamDraper @FEhrsam and @karrisaarinen!
44 replies · 35 reposts · 243 likes · 46.1K views
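The "multimodal, time-synchronized" logs Adrian describes can be pictured as timestamped records tagged by modality and channel, which is what makes "find critical events across petabytes" a query over time. A toy sketch of that record shape and an event search — all names hypothetical, not Foxglove's actual API:

```python
# Hypothetical sketch of a time-synchronized multimodal log record and a
# simple event search (illustrative only; not Foxglove's API or schema).
from dataclasses import dataclass

@dataclass
class LogRecord:
    timestamp_ns: int   # capture time, nanoseconds since epoch
    modality: str       # "video", "imu", "gnss", "audio", ...
    topic: str          # channel name on the robot
    payload: bytes      # raw encoded sample

def find_events(records, predicate):
    """Return timestamps of records matching a predicate, sorted."""
    return sorted(r.timestamp_ns for r in records if predicate(r))

log = [
    LogRecord(100, "imu", "/arm/left", b"..."),
    LogRecord(105, "video", "/egocentric", b"..."),
    LogRecord(110, "imu", "/arm/right", b"..."),
]
print(find_events(log, lambda r: r.modality == "imu"))  # [100, 110]
```

At petabyte scale this linear scan would of course be replaced by time- and topic-partitioned indexes, but the record shape (timestamp + modality + topic + payload) is the common denominator.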
ali @aliuahma ·
eddy and the team @builddotai are sincere in their desire to push the frontier of robotics forwards. i began doing competition robotics against/with them 6 years ago. we love robots, and we want to see robots in the real world doing useful things. their contributions will help us do better science and get closer to a future where robots handle the mundane and dangerous tasks
Quoting Eddy Xu @eddybuild:
"today, we’re open sourcing the largest egocentric dataset in history: 10,000 hours · 2,153 factory workers · 1,080,000,000 frames. the era of data scaling in robotics is here. (thread)"
3 replies · 1 repost · 43 likes · 6.9K views
EIDON AI @eidon_ai ·
@simonkalouche Eidon takes a step toward that by adding arm-tracking data synchronised to the POV video. Next is finger tracking with Eidon Gloves.
0 replies · 0 reposts · 1 like · 712 views
Simon Kalouche @simonkalouche ·
Massive dataset but egocentric video is not going to get us dexterous manipulation policies. Egocentric data provides high-level semantic scene and task understanding (which frontier VLMs already generally provide). What is needed is fine sub-mm-level finger pose & force data.
Quoting Eddy Xu @eddybuild:
"today, we’re open sourcing the largest egocentric dataset in history: 10,000 hours · 2,153 factory workers · 1,080,000,000 frames. the era of data scaling in robotics is here. (thread)"
20 replies · 15 reposts · 220 likes · 39.2K views
Talia Goldberg @TaliaGold ·
Robotics has been VC's favorite way to lose money: ~10 cents returned for every $1 invested over the past decade. Hardware is hard, they said. They were right. At our partner offsite, I presented on Physical AI. "This time is different." Famous last words! But converging tailwinds are rewriting the equation: powerful VLMs, edge compute, lower-cost hardware, and top talent commercializing breakthrough research. Publishing excerpts from the internal presentation w/ @AlexandraSukin @bhavikvnagda: why this time is actually different, what we're looking for, and what we're avoiding. Founders, hit us up!
[4 images attached]
43 replies · 49 reposts · 679 likes · 97.2K views
EIDON AI @eidon_ai ·
IX/ If robotics is AI's endgame, a decentralized version is essential — thousands of hours of synchronized data from diverse contributors will power it. We've started with our wearable IMU trackers, capturing upper-body kinematics in real environments. There is so much more to do.
1 reply · 0 reposts · 9 likes · 1.3K views
EIDON AI @eidon_ai ·
VIII/ Next: extending our data collection with Eidon Gloves and Glasses to capture finger and dexterity data too. There's still a long road of practical problems to solve, though.
[3 images attached]
3 replies · 1 repost · 12 likes · 1.4K views