RoboMove

RoboMove
@RoboMove
signal detected—stand by. awaiting input_▊



This is the chaos produced by a triple pendulum with just 3 degrees of freedom. Your hand has 27: 4 per finger + 5 for the thumb + 6 for the wrist. Building a robot to conduct this orchestra of joints and tendons well enough to pick up a coin is ... non-trivial.
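The 27-DOF tally above can be checked with quick arithmetic (the per-joint breakdown is from the post itself; variable names are just for illustration):

```python
# Tallying hand degrees of freedom as described in the post:
# 4 fingers x 4 DOF each, 5 for the thumb, 6 for the wrist.
fingers = 4 * 4   # index, middle, ring, pinky
thumb = 5
wrist = 6
total = fingers + thumb + wrist
print(total)  # 27
```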


Launch of RoboMove

One month ago, we announced @RoboMove, initially a basic demo to showcase the CodecFlow tech stack. It has since evolved into a standalone mini product. Here's a concise overview.

Built on CodecFlow: fully leverages the @codecopenflow tech stack for robust performance.
Advanced Robotics: lets users explore and test the latest robotics advancements.
Interactive Simulations: users can submit their own simulations and interact with others using the optr SDK and visual editor.

Features of @RoboMove:
Creator Hosting: submit your sim and we'll run it on our live stream. This opens up a multitude of use cases, such as earning royalties for interactions, since we see creator economies only accelerating in this digital era.
Tweet → Attempt → Clip Loop: tweet a command, watch the robot attempt it in real time, and get an auto-generated reply video clipped and shared.
Launch Template: we're kickstarting @RoboMove with the Unitree G1, giving a baseline locomotion model to interact with and customize.
Command Understanding: @RoboMove parses natural-language tweets into task graphs and skills using advanced AI tools. Before execution, commands are validated to ensure safety, with fallback behaviors if things go awry.

Who is this for?
Researchers & teams who want to transform static "results videos" into interactive, repeatable demos: share proof of your breakthroughs, collaborate, and monetize via gated access.
Indie builders who want to ship a sim and gather users, data, and feedback in days, cutting the time data collection takes.
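A minimal sketch of the tweet → task-graph → validation flow described above. Everything here is invented for illustration (the skill names, the keyword "parser" standing in for the AI model, and the `idle` fallback); the real pipeline is not public:

```python
# Hypothetical sketch: parse a tweeted command into a list of skills,
# then validate against a known skill library with a safe fallback.
KNOWN_SKILLS = {"walk_to", "grasp", "wave"}  # assumed skill library

def parse_command(tweet: str) -> list[str]:
    """Toy keyword matcher standing in for the AI parsing step."""
    mapping = {"walk": "walk_to", "pick": "grasp", "wave": "wave"}
    return [skill for word, skill in mapping.items() if word in tweet.lower()]

def validate(task_graph: list[str]) -> list[str]:
    """Drop unknown skills; fall back to a safe idle behavior if empty."""
    checked = [s for s in task_graph if s in KNOWN_SKILLS]
    return checked or ["idle"]

print(validate(parse_command("Please walk over and pick up the coin")))
# ['walk_to', 'grasp']
```

The key design point the post makes is the validation gate: nothing reaches the robot unless every step resolves to a known, safe skill, and an unparseable command degrades to a harmless fallback instead of failing open.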



Earlier this year, we led an investment consortium that invested $120M into @Apptronik. This is our thesis on humanoids and Apptronik: mechanism.capital/writings/our-i…

Designing dexterous hands for robots is a very hard problem. Then figuring out how to manufacture them at scale is 100 times harder.
