Chris

2 posts


@chrisemower

Senior research scientist at Huawei's Noah's Ark Lab.

London, England · Joined June 2017
10 Following · 6 Followers
Chris@chrisemower·
@NivaPlatforms @hbouammar Hi, no, it doesn't infer the shape of the apple, and it doesn't really "practice". We use several previous experiences and select the most relevant one to help improve performance.
Todd Peterson@NivaPlatforms·
@hbouammar Does it actually infer the shape of the apple despite it being occluded? How many times does it need to practice?
Haitham Bou Ammar@hbouammar·
The next breakthrough for robot VLMs isn’t a bigger model 🧠 it’s memory. RAL accepted ✅

We show that a robot can ground a VLM using its own real-world experience, not extra training.

EXPTEACH/PRAGMABOT = self-generated memory + closed-loop reflection:
Fail → diagnose → replan → succeed → store experience
Next time → retrieve the right experience → succeed in one shot

Results on real robot tasks:
36% → 84% with reflection (STM)
22% → 80% single-trial success with long-term memory (LTM+RAG)

And yes, we saw tool use emerge without explicitly asking for it.

If you’re fighting brittle “VLM plans” on hardware, this is for you.

arxiv.org/pdf/2507.16713

#AI #Robotics @anybotics
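The fail → diagnose → replan → succeed → store loop and the one-shot retrieval described in the tweet could be sketched roughly as follows. This is a minimal toy, not the paper's code: every class, function, and the retrieval heuristic are hypothetical stand-ins.

```python
# Hypothetical sketch of a reflection + experience-memory loop for a
# VLM-driven robot. All names are illustrative, not from EXPTEACH/PRAGMABOT.

class ExperienceMemory:
    """Long-term store of (task, diagnosis, successful plan) records."""
    def __init__(self):
        self.records = []

    def store(self, task, diagnosis, plan):
        self.records.append({"task": task, "diagnosis": diagnosis, "plan": plan})

    def retrieve(self, task):
        # Toy retrieval: most recent record whose task shares a word with
        # the query (a real system would use RAG-style embedding search).
        words = set(task.split())
        for rec in reversed(self.records):
            if words & set(rec["task"].split()):
                return rec
        return None


def attempt(task, plan):
    # Stand-in for executing a plan on hardware; here, any plan that
    # includes a "regrasp" step is assumed to succeed.
    return "regrasp" in plan


def solve(task, memory, max_tries=3):
    # One-shot reuse: start from a retrieved prior plan if one exists.
    prior = memory.retrieve(task)
    plan = prior["plan"] if prior else "naive plan"
    for _ in range(max_tries):
        if attempt(task, plan):
            memory.store(task, "n/a", plan)  # store the success for later
            return plan
        # Reflection step (stubbed): diagnose the failure and replan.
        diagnosis = "grasp slipped"
        plan = plan + " + regrasp"
    return None
```

On the first task the loop fails once, reflects, and succeeds; a later similar task retrieves the stored plan and succeeds on the first attempt, mirroring the STM/LTM behavior the thread reports.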
Chris retweeted
Haitham Bou Ammar@hbouammar·
There is a new robotic revolution in terms of hardware! Yet, robotics software has stagnated at ROS for some reason. While ROS is what people are using, it presents a significant hurdle for many of us in the ML field to get started (interfacing with the hardware, C/C++ 🤯). Well, we want to change that!

Welcome to the new generation of robotics software, based on Python! Yep, Python! So you can pip install robotics!!

The paper is coming to arxiv, but as usual, my LinkedIn followers get a first view, so have a read through the paper (robotics-ark.github.io/ark_robotics.g…)! We are opening it step by step to allow feedback and engagement, please star our repos: github.com/orgs/Robotics-…

🚀 Meet Ark – a Python-first, open-source framework that finally lets ML engineers and roboticists speak the same language. One pip install, no labyrinthine C++ build chains.
🔄 One flag = sim-to-real. Toggle sim: True/False in a YAML file and run the exact same policy on PyBullet/MuJoCo or physical hardware – no code rewrites, no ROS launch-file plumbing.
🧩 Node-based, LCM pub-sub architecture means sensors, policies, and controllers are hot-swappable processes. Hack, crash, or iterate without taking the whole stack down.
📦 Built-in data & debugging tools – LCM logger/player, real-time plots, graph viewer, and camera streams – slash time spent on “why is my topic empty?” detective work.
🏋️ Out-of-the-box imitation-learning pipelines (Diffusion Policy, ACT) with reusable data-collection nodes (VR, gamepad, kinesthetic). RL integration is next on the roadmap.
🤝 Interoperates, doesn’t dictate – optional ROS bridge, clean C/C++ bindings for real-time loops, and a backend API that can host any simulator or custom driver.
🗺️ SLAM → Planning → Control modules included: Fast-SLAM mapping, A* path planning with safety margins, PD waypoint tracking – demoed on a Husky in a kitchen maze.
🤖 Case studies: ViperX pick-&-place, A1 humanoid cloth folding, language-conditioned board-game play via DeepSeek-R1, mobile inspection – all reproducible from provided configs.

👉 Call to action: Check out the repo, try Ark on your lab robot this week, and let us know what you build!

#AI #Machinelearning #Robotics #roboticsurgery #ycombinator #Robot
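The "one flag = sim-to-real" idea from the thread could look roughly like this. A minimal sketch only: the backend classes, config layout, and function names are hypothetical and not Ark's actual API.

```python
# Hypothetical sketch of a sim/real toggle driven by a single config flag,
# in the spirit of Ark's "sim: True/False in a YAML file". Names are
# illustrative, not Ark's actual API.

class SimBackend:
    """Stand-in for a simulated environment (e.g. PyBullet/MuJoCo)."""
    def step(self, action):
        return f"sim step: {action}"

class RealBackend:
    """Stand-in for a physical-hardware driver."""
    def step(self, action):
        return f"hardware step: {action}"

def make_backend(config):
    # The policy code never branches on sim vs. real; only this factory does.
    return SimBackend() if config.get("sim", True) else RealBackend()

def run_policy(config, actions):
    backend = make_backend(config)
    # Exactly the same loop runs against either backend.
    return [backend.step(a) for a in actions]
```

Flipping `sim` from True to False changes which backend `make_backend` returns while `run_policy` stays byte-for-byte identical, which is the design point the tweet is making: no code rewrites between simulation and hardware.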