Ben

45 posts

@shinhan2000

Chief tech _ @strikerobot_ai / master's degree in AI

Joined December 2025
28 Following · 133 Followers
Ben retweeted
Frank
Frank@0xFrankEth·
Strike Robot x Venice: Privacy-Focused AI Infrastructure for Robots

@StrikeRobot_ai faced one of the most critical problems when building AI infrastructure for autonomous robots: which inference engine could run fast, securely, and uncensored? A solution was needed, and as everyone knows, the answer was found with @AskVenice.

Venice is a privacy-focused, uncensored AI inference platform. Strike Robot chose it as the primary inference backend for SR Agentic and SR Platform. This partnership isn't just a technical integration, my friends; Venice is also providing credit sponsorship and co-developing with Strike Robot, a huge step forward!

What's changing technically? Inside SR Agentic, Venice now runs as the VLM (Vision-Language Model) reasoning engine. Visual understanding in complex environments, multi-step reasoning, and natural-language reporting all pass through Venice. Thanks to the OpenAI-compatible API, integration is clean and fast, and the edge loop remains in place. The two layers don't do each other's work; each performs its own task.

This partnership is particularly critical for $SR. Venice's privacy infrastructure plus Strike Robot's autonomous systems is not just a technical integration; it redefines how robotic AI is trained.
Frank tweet media
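The "OpenAI-compatible API" claim above means any backend accepting the standard chat-completions request shape can serve as the VLM layer. A minimal sketch of what such a vision request body looks like, assuming the common image-URL content convention; the model name here is a placeholder, not Venice's actual catalog:

```python
import json

def build_vlm_request(prompt: str, image_url: str,
                      model: str = "placeholder-vlm") -> str:
    """Build a JSON body for an OpenAI-compatible /chat/completions call.

    Mixed text + image content follows the OpenAI chat-completions
    vision convention; any compatible backend that supports vision
    input accepts the same shape. The model name is an assumption.
    """
    body = {
        "model": model,
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }],
    }
    return json.dumps(body)

payload = build_vlm_request(
    "Describe any obstacles in front of the robot.",
    "https://example.com/frame.jpg",
)
```

Swapping inference providers then reduces to changing the base URL and API key, which is why the edge loop can stay untouched.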
Replies 0 · Reposts 1 · Likes 11 · Views 535
Ben
Ben@shinhan2000·
I remember that when I competed in the national science research competition, I also researched this quadruped robot topic. Back then, I focused on the range of motion and trajectory of the robot's legs through forward and inverse kinematics. Today, the development of artificial intelligence has enabled robots to move across various terrains and easily adapt to complex environments. The development of @StrikeRobot_ai has also progressed from basic foundations to more complex technologies such as VLM and World Modeling.
真広(まひろ)@med_mahiro

KX-01 (a 4.5 kg small quadruped dog robot). I fine-tuned and retrained it using Mujina as a reference, then added uneven terrain and verified it Sim-to-Sim (mujoco_ros2_control). Toward Sim-to-Real, it now supports gamepad control and trim adjustment. It feels more agile than last time.
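The forward and inverse kinematics Ben mentions can be made concrete with a planar two-link leg, the textbook model for a quadruped leg's hip-knee pair. Link lengths here are illustrative placeholders, not any specific robot:

```python
import math

def leg_fk(theta1: float, theta2: float,
           l1: float = 0.21, l2: float = 0.21):
    """Forward kinematics of a planar two-link leg.

    theta1: hip angle, theta2: knee angle (radians, measured so the
    leg hangs straight down at 0, 0). Returns the foot position
    (x, z) in the hip frame, z pointing up.
    """
    x = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    z = -l1 * math.cos(theta1) - l2 * math.cos(theta1 + theta2)
    return x, z

def leg_ik(x: float, z: float,
           l1: float = 0.21, l2: float = 0.21):
    """Inverse kinematics via the law of cosines (knee-back branch)."""
    d2 = x * x + z * z
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    theta2 = math.acos(max(-1.0, min(1.0, c2)))     # clamp for safety
    theta1 = math.atan2(x, -z) - math.atan2(
        l2 * math.sin(theta2), l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

Sweeping theta1/theta2 through `leg_fk` traces out exactly the foot workspace and trajectories that such a competition project would study.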

Replies 1 · Reposts 3 · Likes 4 · Views 79
Ben
Ben@shinhan2000·
That's the point!! World models
Trueman (CHU Meng)@truemanv5666

We recently released a survey / position paper on Agentic AI × World Models: "Agentic World Modeling: Foundations, Capabilities, Laws, and Beyond".

The paper has received encouraging attention since its release, with related pages now surpassing 100K+ views. Thank you all for the discussions, feedback, and reposts, especially @dotey @_akhaliq @omarsar0.

We propose a Levels × Laws framework:
Levels: L1 Predictor → L2 Simulator → L3 Evolver
Laws: Physical / Digital / Social / Scientific World

The paper surveys 400+ related works and summarizes 100+ representative systems, spanning model-based RL, video generation, Web/GUI agents, multi-agent systems, robotics, and AI for Science. We hope this work provides a clearer conceptual map for the growing discussion around Agentic AI and World Models.

GitHub: github.com/matrix-agent/a… (thanks for your stars ⭐️, we will update it frequently)
Paper: arxiv.org/abs/2604.22748
Homepage: agentic-world-modeling.xyz

Feedback, critiques, and reposts are very welcome.

Replies 3 · Reposts 2 · Likes 8 · Views 107
Ben retweeted
3DMax
3DMax@3DMax_Virtuals·
Team keeps updating GitHub; they're cooking something with the $SR platform. Once again, I'll say this: did you know that $VVV, now at a $1.34B valuation and one of the best-performing AI projects on Base, chose @StrikeRobot_ai as its first robotics partner? (I can't believe $SR is so low.) $POD is going parabolic. $cyb3rwr3n hit an ATH. Next: $SR to a $20M market cap. github.com/orgs/StrikeRob…
Replies 4 · Reposts 4 · Likes 46 · Views 2.5K
Ben
Ben@shinhan2000·
The evolution of StrikeRobot: VLM + Action model -> VLA -> ARLMA? Robots can learn on their own through other models. Humans don't need to define reward functions for the robots; the robots will find their own reward functions.
Base APAC@baseapac

Finalist –– @StrikeRobot_ai A modular cognitive awareness system for autonomous robots. StrikeRobot enables robots to process environmental data and operate across different hardware and real-world environments.
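Ben's claim above, that robots can find their own reward functions instead of having humans define them, is the idea behind intrinsic motivation. A toy 1-D sketch of one common flavor, curiosity via forward-model prediction error (an illustration of the general concept, not Strike Robot's actual method):

```python
def intrinsic_reward(model: dict, state, action, next_state: float,
                     lr: float = 0.1) -> float:
    """Curiosity-style intrinsic reward: the error of a learned forward
    model predicting the next state.

    Novel transitions are poorly predicted, so they earn high reward;
    as the model improves on them, the reward decays, pushing the
    agent toward new experience. No hand-written task reward appears
    anywhere. Toy tabular version for illustration only.
    """
    key = (state, action)
    pred = model.get(key, 0.0)                    # model's current guess
    error = abs(next_state - pred)                # surprise = reward
    model[key] = pred + lr * (next_state - pred)  # online model update
    return error

model = {}
# Repeating the same transition makes its intrinsic reward decay.
rewards = [intrinsic_reward(model, 0, 1, 5.0) for _ in range(20)]
```

The decaying reward sequence is the whole mechanism in miniature: the signal that drives exploration is generated by the learner itself.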

Replies 3 · Reposts 3 · Likes 8 · Views 202
Ben retweeted
Eren Chen
Eren Chen@ErenChenAI·
Why Robots Work in Simulation but Fail in Reality

One of the most frustrating moments in robotics: everything works perfectly in simulation. Then you deploy it on a real robot and suddenly:
- The grasp misses
- The arm shakes
- The robot drifts
- Contact becomes unstable
- The motion looks correct, but the task still fails

How do you solve the sim-to-real problem? At first, it sounds simple: just move the code from simulation onto hardware. But the gap between simulation and reality is much larger than most people think.

Simulation environments are extremely clean. The table is flat. Object geometry is accurate. Friction is predefined. Sensors are stable. Robot joints behave exactly as expected. But the real world is messy. Lighting changes. Depth sensors drift. Objects reflect light differently. Motors have delay. Joints have backlash. Contact forces behave unpredictably.

And robotics is a chain reaction. A small perception error becomes a planning error. The planning error becomes a control error. The control error becomes an execution error. Eventually, the robot misses the grasp by a few centimeters and the entire task fails.

The hardest part is usually contact. Humans think tasks like grasping a cup, opening a door, inserting an object, or pushing a box are trivial. For robots, these are extremely difficult because contact is not clean physics. A tiny shift in friction, force, or surface geometry can completely change the result.

In simulation, objects are usually "well behaved." In reality:
- objects slip
- contact points shift
- surfaces deform
- collisions happen unexpectedly

This is why many robotic tasks fail not because the policy is fundamentally wrong, but because reality itself introduces uncertainty.

Sensors are also less reliable than people think. The robot's perception already contains error:
- camera noise
- unstable depth estimation
- occlusion
- pose estimation drift
- changing lighting conditions

Sometimes the model itself is fine, but the input is already slightly wrong. By the time the error propagates to the end effector, the grasp fails.

The robot hardware itself is also imperfect. Motors have latency. Controllers have frequency limits. Actuators have error. Different loads change behavior. In simulation, the robot follows commands perfectly. In reality, it may move slightly slower, slightly off target, or slightly unstably. Those tiny differences are fatal in robotics because robots physically interact with the world.

Sim2Real being difficult does not mean simulation is useless. Simulation is still incredibly valuable: it is cheap, safe, scalable, and reproducible. A better way to think about it: simulation is the training ground, not the final battlefield.

Modern Sim2Real methods usually combine multiple approaches: making simulation more realistic, adding domain randomization (randomizing lighting, friction, object positions, and sensor noise), and fine-tuning with real-world data. The goal is not to make the robot adapt to one perfect virtual world. The goal is to make the robot robust enough to survive an imperfect real one.

The most important lesson in robotics: success in simulation is only the first step. The real test begins when the robot touches the real world.

Video credit: Kevin Zakka
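The domain-randomization approach the thread describes is easy to sketch: each training episode draws a fresh set of physics and sensor parameters, so no single "perfect" world exists to overfit to. The parameter names and ranges below are illustrative, not tuned for any particular robot:

```python
import random

def randomize_sim(rng: random.Random) -> dict:
    """Sample one randomized simulation configuration.

    Training a policy across many such draws forces it to be robust
    to the friction, mass, latency, and sensor-noise variation it
    will meet on real hardware. All ranges are illustrative.
    """
    return {
        "friction": rng.uniform(0.4, 1.2),         # surface friction coeff
        "mass_scale": rng.uniform(0.8, 1.2),       # payload variation
        "motor_latency_ms": rng.uniform(0.0, 30.0),
        "depth_noise_std_m": rng.uniform(0.0, 0.02),
        "light_intensity": rng.uniform(0.3, 1.0),
    }

rng = random.Random(0)                  # seeded for reproducibility
configs = [randomize_sim(rng) for _ in range(100)]
```

Each config would be applied to the simulator before an episode; the policy never sees the same world twice, which is the point.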
Replies 11 · Reposts 40 · Likes 277 · Views 24.3K
Ben retweeted
Strike Robot
Strike Robot@StrikeRobot_ai·
1,000 MuJoCo scenes. 46k+ STL meshes. Focus on Unitree G1. Now open-source on @huggingface. Sim-to-real fails when training environments lack diversity. Hand-authoring scenes doesn't scale. So we contributed 1,000 of them — varied geometry, varied object placement, all physics-validated, all MJCF-ready in one line of code. SR Platform is the agentic system behind it: natural-language prompt → structured scene plan → cached/generated CadQuery assets → validated layout → executable MuJoCo scene.
Strike Robot tweet media
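The "MJCF-ready in one line of code" claim refers to MuJoCo's XML scene format. A minimal toy scene, validated here with only the standard library; the geometry is arbitrary and a real dataset scene would reference STL meshes via asset entries rather than primitive geoms:

```python
import xml.etree.ElementTree as ET

# A minimal MJCF scene: one box resting on a plane.
MJCF = """
<mujoco model="toy_scene">
  <worldbody>
    <geom name="floor" type="plane" size="2 2 0.1"/>
    <body name="crate" pos="0.3 0 0.1">
      <freejoint/>
      <geom name="crate_geom" type="box" size="0.1 0.1 0.1" mass="1"/>
    </body>
  </worldbody>
</mujoco>
"""

root = ET.fromstring(MJCF)  # structural sanity check, stdlib only
# With the mujoco Python bindings installed, loading really is one line:
# model = mujoco.MjModel.from_xml_string(MJCF)
```

Because the whole scene is declarative XML, generating thousands of variants (the pipeline the tweet describes) reduces to emitting different geom/body trees.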
Replies 16 · Reposts 12 · Likes 72 · Views 8.8K
Ben retweeted
Venice
Venice@AskVenice·
Venice is powering @StrikeRobot_ai's private robotics stack Our step into robotics is built on the same principle behind everything we do: Your AI shouldn't spy on you.
Strike Robot@StrikeRobot_ai

IT'S OFFICIAL! We're announcing our partnership with @AskVenice — a privacy-first, uncensored AI inference platform on @base — as a primary inference API backend for StrikeRobot’s product offerings. The partnership involves Venice's credit sponsorship and co-development with StrikeRobot to become the VLM reasoning and inference layer for robots, starting with SR Agentic and SR Platform. Robotic anonymous training is here!

Replies 42 · Reposts 81 · Likes 491 · Views 51.7K
Ben retweeted
Strike Robot
Strike Robot@StrikeRobot_ai·
IT'S OFFICIAL! We're announcing our partnership with @AskVenice — a privacy-first, uncensored AI inference platform on @base — as a primary inference API backend for StrikeRobot’s product offerings. The partnership involves Venice's credit sponsorship and co-development with StrikeRobot to become the VLM reasoning and inference layer for robots, starting with SR Agentic and SR Platform. Robotic anonymous training is here!
Replies 65 · Reposts 89 · Likes 498 · Views 119.6K
Ben
Ben@shinhan2000·
ZXX
Replies 0 · Reposts 0 · Likes 2 · Views 28
Ben retweeted
Strike Robot
Strike Robot@StrikeRobot_ai·
Strike Robot is now Top 4 on Datanet Volume Traded on @reppo 🔥 After ~1 month of going public: → ~14M volume traded → 10.1% network share → 13.59M total votes SR Platform – Strike Robot is among the leading datanets, building alongside top contributors in the Reppo ecosystem. Explore more: reppostats.com/analytics
Strike Robot tweet media
Replies 49 · Reposts 26 · Likes 154 · Views 36.7K
Ben retweeted
Nyx
Nyx@tienho_nyx·
StrikeRobot is one of the projects I've been following and involved in quite early on, and up to now it's still building along a very clear direction: AI + Robotics + Web3 @StrikeRobot_ai

At its core, the project focuses on:
- Autonomous AI agents
- Direct integration with robotics systems
- Infrastructure that brings AI from simulation into the real world

So it's not just AI "on screen" anymore; it's AI that can act in the physical world.

What keeps me with StrikeRobot isn't short-term hype, but the direction they're taking. While most AI projects today are still focused on models, automation, or data layers, StrikeRobot is tackling a much harder problem: → embodied AI

Personally, I see this as the next step for AI. When AI starts to control robots, participate in production / logistics, and create real-world value, then the narrative shifts beyond just "AI tools."

I've been following this project for a while, and a few things stand out to me:
- A relatively consistent long-term vision
- Strong alignment with major narratives: AI agents + machine economy
- Building within the Virtuals ecosystem, leveraging attention and the creator layer

From a long-term perspective, my view remains the same:
👉 AI won't stay purely digital
👉 Projects like StrikeRobot are already one step ahead in that direction
Replies 93 · Reposts 1 · Likes 92 · Views 1.6K
Ben retweeted
Strike Robot
Strike Robot@StrikeRobot_ai·
From seeing → to understanding → to following → now to finding.

In the last update, our robot learned how to recognize and follow people. SR Agentic now evolves from human-following to goal-driven object search.

Give it a task like:
→ "find the white bottle on the table"

It will:
• break down the instruction via LLM task planning
• scan the environment in real time
• localize the object in 3D space
• navigate and approach autonomously

Step by step, capability by capability: from perception → to tracking → to task execution in the physical world. This is how real-world agentic robotics is built.
Strike Robot@StrikeRobot_ai

From seeing → to understanding → now to following. In the previous video, we showed how the robot sees and navigates the world. This time, we’ve upgraded SR Agentic — it can now identify a designated person and autonomously follow them in real-time. Powered by perception + reasoning + adaptive navigation, the robot doesn’t just react - it tracks intent and moves with purpose.
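The four-stage pipeline in the tweet above (plan → scan → localize → navigate) can be sketched as a toy task decomposition. In the real system an LLM would emit this plan; here the skill names are hypothetical and the stages are hard-coded purely for illustration:

```python
def plan_object_search(instruction: str) -> list[str]:
    """Toy stand-in for the LLM task-planning step: turn a
    natural-language goal into an ordered skill sequence.

    Skill names (detect, localize_3d, navigate_to, report) are
    hypothetical placeholders, not SR Agentic's actual API.
    """
    target = instruction.removeprefix("find ").strip()
    return [
        f"detect('{target}')",       # scan camera frames for the target
        f"localize_3d('{target}')",  # lift the detection into 3-D space
        f"navigate_to('{target}')",  # plan a path and approach
        f"report('{target}')",       # confirm task completion
    ]

steps = plan_object_search("find the white bottle on the table")
```

Keeping the plan as an explicit ordered list is what lets each capability (perception, tracking, navigation) be developed and swapped independently, as the tweet's step-by-step framing suggests.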

Replies 53 · Reposts 17 · Likes 119 · Views 21.8K
Ben retweeted
Office2Crypto
Office2Crypto@office2crypto·
The much-awaited and teased launch of $SR by @StrikeRobot_ai is now live on @virtuals_io. Sitting at a $3.6M FDV with a lot of factors that can push the price forward. This is the 2nd robotics launch under the new robotics mechanism. Fully positioned in Virtuals robotics. I am ready to get hurt again 🫡
Office2Crypto tweet media
Strike Robot@StrikeRobot_ai

Humanoid robotics is the next great labor shift — and the real opportunity is not in the hardware. Strike Robot is now live on @virtuals_io — we’re building SR Agentic, a plug-and-play agentic framework for robotics, starting with Unitree G1, enabling autonomous anomaly detection, safety monitoring, and continuous operation in complex environments. AI makes robots intelligent. $SR makes them investable. Every robot. Its own right. Our journey starts today.

Replies 0 · Reposts 1 · Likes 14 · Views 1.2K
Ben retweeted
Virtuals Protocol
Virtuals Protocol@virtuals_io·
Virtuals is for plug-and-play robotics. Any Unitree G1 can run anomaly detection, safety monitoring, and continuous ops right out of the box.
Strike Robot@StrikeRobot_ai

Humanoid robotics is the next great labor shift — and the real opportunity is not in the hardware. Strike Robot is now live on @virtuals_io — we’re building SR Agentic, a plug-and-play agentic framework for robotics, starting with Unitree G1, enabling autonomous anomaly detection, safety monitoring, and continuous operation in complex environments. AI makes robots intelligent. $SR makes them investable. Every robot. Its own right. Our journey starts today.

Replies 32 · Reposts 43 · Likes 306 · Views 27.7K