Sai Kishor Kothakota

1.1K posts

@manforrobots

Sr. Robotics Engineer @PALRobotics. A keen robotics enthusiast who gets motivated by robots and their evolution.

Barcelona, Spain · Joined September 2016
422 Following · 310 Followers
Pinned Tweet
Sai Kishor Kothakota @manforrobots
Go @OlympusMonsTeam! Go @PALRobotics! 💪🏽💪🏽 Two years of hard work turned into a dream come true!! Very proud to be part of the team!
Space Center Houston @SpaceCenterHou

Congratulations to the winners of the @NASAPrize Space Robotics Challenge Phase 2, presented by BHP! These teams have been working since 2019 to help develop code for NASA’s next generation of space robots. Visit the link to learn more! bit.ly/2H4xbyO

Sai Kishor Kothakota retweeted
Chen Tessler @ChenTessler
Animation 🤝 Robotics ProtoMotions GTC 2026 release — bridging the gap between digital humans and real humanoid robots. Train in simulation. Deploy on hardware. One framework, one codebase. nvlabs.github.io/ProtoMotions
Sai Kishor Kothakota retweeted
Bones Studio @TheBonesStudio
Training humanoid robots? You need motion data. Real, high-fidelity, human motion data. And until now, there was no open dataset purpose-built for humanoid robotics.

For 5 years, we've been building the largest enterprise-grade human motion and behavior datasets for embodied AI. Our data powered breakthrough SONIC research. Today, at GTC, with @NVIDIARobotics, we're opening a piece of it to the world.

BONES-SEED:
→ 142,200 motion capture animations
→ Up to 6 natural language descriptions per motion
→ Temporal segmentation of every action
→ Curated for humanoid robotics
→ In NVIDIA SOMA and Unitree G1 (MuJoCo) formats

From text to action. Now yours. Go build → bones.studio/datasets/seed #NVIDIAGTC
Sai Kishor Kothakota retweeted
Kevin Zakka @kevin_zakka
Coming soon to mjlab: heterogeneous worlds, aka every world gets its own object 👀
Eren Chen @ErenChenAI
Raise your hand if you would like this cute Booster K1 figure 🤚 🤚
Sai Kishor Kothakota retweeted
Sai Kishor Kothakota @manforrobots
@kevin_zakka If the user has it enabled, it will automatically review their commits. See "Configuring automatic code review by GitHub Copilot" in the GitHub Docs: docs.github.com/en/copilot/how… You can find more information there on how to disable it for mjlab, if it is enabled.
Kevin Zakka @kevin_zakka
Does anyone know how to stop this? I never approved or allowed automatic reviewing...
Sai Kishor Kothakota @manforrobots
Huge shoutout to @LouisLeLay4 for his stellar work during his internship at PAL! 🌟 We’ve achieved some great results, and while there’s always more to build, the momentum is real 📈. Stay tuned! 👀🚀 Thank you for all the time and dedication @LouisLeLay4
Louis Le Lay @LouisLeLay4

Open-sourcing pal_mjlab -- the RL work I did at PAL Robotics. Velocity tracking, motion imitation, and dual-arm reaching across KANGAROO, TALOS, and TIAGo Pro. Multiple tasks transferred to real hardware, all built on @kevin_zakka's mjlab.

Kevin Zakka @kevin_zakka
Happy Friday!! mjlab v1.2.0 is out. This is our biggest release yet, with 60+ PRs from 12 contributors. pip install mjlab
Some highlights include:
- New, more powerful domain randomization module
- Revamped, ergonomic viewers
- Cloud training via @SkyPilot
- Complete doc rewrite
Sai Kishor Kothakota retweeted
Kevin Zakka @kevin_zakka
New feature coming to mjlab: "step" events that fire every simulation tick. First use case for this event is implementing random force impulses applied to bodies with configurable duration + cooldown. It works alongside mouse perturbations in the native viewer 🥊
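A cooldown-gated push event of this sort can be sketched in plain Python. This is an illustrative standalone sketch, not mjlab's actual API: the class name, parameters, and 1-D force are all made up for demonstration.

```python
import random

class RandomPushEvent:
    """Fires every simulation tick; applies a random force impulse for
    `duration` seconds, then waits out a `cooldown` before rearming."""

    def __init__(self, duration, cooldown, max_force, seed=0):
        self.duration = duration
        self.cooldown = cooldown
        self.max_force = max_force
        self.rng = random.Random(seed)
        self.force = 0.0          # force currently being applied
        self.active_until = 0.0   # sim time at which the push ends
        self.ready_at = 0.0       # sim time at which a new push may start

    def step(self, t):
        """Called once per physics tick; returns the force to apply at time t."""
        if t >= self.ready_at:
            # arm a new impulse with a random magnitude and sign
            self.force = self.rng.uniform(-self.max_force, self.max_force)
            self.active_until = t + self.duration
            self.ready_at = self.active_until + self.cooldown
        return self.force if t < self.active_until else 0.0
```

Calling `step` every tick yields bursts of constant force separated by quiet cooldown windows, which is the duration + cooldown behavior the tweet describes.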
Sai Kishor Kothakota retweeted
Kevin Zakka @kevin_zakka
The viser viewer in mjlab just got a huge QOL upgrade!
- Real-time factor control: go slower or faster than real time, and the viewer paces physics to match
- Single-step mode: advance one physics step at a time (super useful for debugging!)
- Overall faster and smoother
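The real-time-factor idea boils down to pacing the physics loop against a wall clock. A minimal standalone sketch follows; it is not the actual mjlab/viser code, and the function and parameter names are invented:

```python
import time

def run_paced(step_fn, dt, real_time_factor=1.0, n_steps=100):
    """Call step_fn() n_steps times, sleeping between calls so that
    simulated time (n_steps * dt) advances at real_time_factor x wall time.
    real_time_factor > 1 runs faster than real time, < 1 slower;
    n_steps=1 gives a single-step debugging mode."""
    wall_per_step = dt / real_time_factor
    deadline = time.perf_counter()
    for _ in range(n_steps):
        step_fn()  # advance physics by one tick of dt
        deadline += wall_per_step
        remaining = deadline - time.perf_counter()
        if remaining > 0:  # only sleep when ahead of schedule
            time.sleep(remaining)
```

Accumulating an absolute deadline (rather than sleeping a fixed amount each tick) keeps the average rate on target even when individual steps run long.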
Davide Faconti @facontidavide
They at least deserve a "thanks". Claude Code Max free for open source developers 😁
Sai Kishor Kothakota retweeted
Siyuan Huang @siyuanhuang95
You might have seen the WuBOT performing at the 2026 Spring Festival Gala; however, most of the high-dynamic extreme motions you see are executed by overfitted tracking policies. Until now, training a unified policy capable of performing various extreme motions with a high success rate remained an unsolved challenge.

We spent an entire year digging into the barrier between general tracking and extreme physical behaviors. After burning through dozens of G1 robots, we finally identified the bottlenecks of learning and physical executability. With these discoveries, we developed OmniXtreme: the first general policy that can execute diverse extreme motions, including consecutive flips, extreme balancing, and even breakdancing with rapid contact switches!

This capability is achieved by pre-training a flow-based generative control policy and then post-training with actuation-aware residual RL for complex physical dynamics, a step we found critical for successful real-world transfer.

This work is a joint collaboration with @UnitreeRobotics. Together, we are pushing the physical limits of humanoid robots. It is incredibly exciting to see a general "robot gymnast" and "robot breakdancer" come to life! It was also our first time publishing a paper with XingXing, which was an enlightening experience.

The model checkpoints are now released; we welcome you to play with them! 📦
📄 Paper: arxiv.org/abs/2602.23843
🌐 Project: extreme-humanoid.github.io
💻 Code: github.com/Perkins729/Omn…
Kevin Zakka @kevin_zakka
mjlab now supports cloud training via SkyPilot. One command launches a GPU instance, syncs your code, trains, and tears down when done. We support 2 modes: direct uv install and Docker. Multi-GPU and hyperparameter sweeps work out of the box. mujocolab.github.io/mjlab/main/sou…
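A SkyPilot task file for this kind of launch might look roughly like the sketch below. It is illustrative only: the `resources`/`setup`/`run` keys are standard SkyPilot, but the accelerator choice and the training command are assumptions, not mjlab's documented entry point.

```yaml
# task.yaml (launch with: sky launch task.yaml)
resources:
  accelerators: L4:1        # any single GPU; the choice here is illustrative

setup: |
  pip install mjlab         # direct install mode; a Docker image also works

run: |
  python -m mjlab.train     # hypothetical entry point; see the mjlab docs
```

When the run finishes, `sky down` tears the instance back down, matching the launch-train-teardown flow described above.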
Sai Kishor Kothakota @manforrobots
High-fidelity simulation meets the ROS controls ecosystem! 🤖✨ I'm excited to introduce mujoco_ros2_control, a professional-grade bridge bringing the speed and contact stability of MuJoCo to #ROS2. It's built for researchers and engineers who need something better.
Sai Kishor Kothakota @manforrobots
Initially developed at NASA iMETRO for advanced robot hardware, this project has moved to the ROS Controls organization. Huge thanks to the team: Erik Holum, Nathan Dunkelberger, and our amazing community contributors.
Sai Kishor Kothakota @manforrobots
Simulate complex mechanical behaviors accurately:
🔹 Transmissions: Full support for mechanical transmissions.
🔹 Mimic Joints: Specialized handling for grippers and linkages via tendons and equality constraints.
🔹 Floating Base: Built-in odometry for mobile and humanoid robots.
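The mimic-joint mechanism mentioned above can be illustrated with a minimal MJCF fragment. The body and joint names here are made up; the `<equality><joint>` coupling with `polycoef` is standard MuJoCo and constrains one joint to track a polynomial of another (here simply q_right = q_left, a 1:1 mimic):

```xml
<mujoco model="gripper_mimic">
  <worldbody>
    <body name="left_finger">
      <joint name="finger_left" type="slide" axis="1 0 0" range="0 0.04"/>
      <geom type="box" size="0.01 0.01 0.02"/>
    </body>
    <body name="right_finger" pos="0.1 0 0">
      <joint name="finger_right" type="slide" axis="-1 0 0" range="0 0.04"/>
      <geom type="box" size="0.01 0.01 0.02"/>
    </body>
  </worldbody>
  <equality>
    <!-- finger_right mimics finger_left with ratio 1: q1 = 0 + 1*q2 -->
    <joint joint1="finger_right" joint2="finger_left" polycoef="0 1 0 0 0"/>
  </equality>
</mujoco>
```

Changing the second `polycoef` entry scales the coupling ratio, which is how gripper linkages with non-unit gear ratios are typically expressed.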