Armsentientrobotics

15 posts

Armsentientrobotics
@armsentient

Open-source desktop 12-DOF quadruped robot with arm/gripper, featuring intelligent servo control and voice recognition

Joined April 2026
0 Following · 3 Followers
Armsentientrobotics @armsentient
this is how the robot learns to walk without anyone teaching it. three observation inputs feed into the network: robot state vector (joint positions, velocities), IMU data (orientation, angular rates), and camera feed as raw image tensors. the robot sees, feels, and knows where it is. encoder layers compress all that into a latent representation. hidden layers process it through the PPO/SAC policy network. policy output generates joint torques, an action vector that tells all 12 servos exactly how much force to apply. but that's only half the loop. joint torques go into MuJoCo simulation. MuJoCo computes physics, returns new state. reward computation scores the action: did you move forward? waste energy? stay stable? minimize jerk? that reward signal flows back through gradient descent optimization into both the policy network and the value network. the value network estimates how good a state is. the policy network decides what to do. they train together, millions of steps, until the robot walks like it was born knowing how. then we transfer the policy to real hardware.
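the reward terms described above (forward progress, energy, stability, jerk) can be sketched as one scoring function. a minimal Python illustration — the weights and signal names here are assumptions for clarity, not the values used in the actual training pipeline.

```python
import numpy as np

def compute_reward(base_vel_x, joint_torques, roll, pitch, joint_accels):
    """Score one control step. Weights are illustrative guesses,
    not tuned values from the real reward computation."""
    forward = 1.0 * base_vel_x                          # did you move forward?
    energy = -0.001 * np.sum(np.square(joint_torques))  # waste energy?
    stability = -0.5 * (abs(roll) + abs(pitch))         # stay stable?
    jerk = -0.0001 * np.sum(np.square(joint_accels))    # minimize jerk?
    return forward + energy + stability + jerk
```

in a real PPO/SAC setup this scalar is what flows back through the optimizer into the policy and value networks.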
Armsentientrobotics @armsentient
the full communication stack of a robot that talks to the cloud, learns from GPU servers, and fits on your desk. Robot (ESP32) connects to home WiFi. from there it reaches Claude Cloud API for cognitive reasoning, natural language commands, and mission planning. Claude doesn't run on the robot, it runs in the cloud and sends strategic decisions back. GPU training server sits on the other end. RL policies trained on massive parallel sim environments, results pushed to cloud storage for logs, then deployed back to the robot. the robot gets smarter without needing a GPU strapped to its back. control it from your phone. two mobile app dashboards connected over WiFi for teleoperation and monitoring. or go direct with BLE remote control, no internet needed, just point and drive. three communication paths: WiFi for cloud AI + training, BLE for local direct control, cloud storage for persistent logging. redundancy built in. a desktop quadruped with the same comms architecture as robots 100x its price.
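the three paths above imply a simple selection rule: cloud work needs WiFi, local teleop prefers WiFi but falls back to BLE. a hypothetical sketch — the function and return values are illustrative, not from the firmware.

```python
def pick_transport(wifi_up: bool, ble_paired: bool, need_cloud: bool) -> str:
    """Choose a communication path per the architecture described above.
    Names are illustrative assumptions, not actual firmware identifiers."""
    if need_cloud:
        # Claude API and training-server sync only work over WiFi
        return "wifi" if wifi_up else "unavailable"
    if wifi_up:
        return "wifi"        # prefer WiFi for app teleoperation + monitoring
    if ble_paired:
        return "ble"         # direct BLE control, no internet needed
    return "unavailable"
```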
Armsentientrobotics @armsentient
OBS bugging out. MuJoCo decided to have a seizure mid-recording. sorry guys, the sim environment threw a bug right when we were capturing the training pipeline footage. typical robotics moment, nothing works when the camera is on. gonna fix it and try again. the robot doesn't care about your recording schedule. stay tuned.
Armsentientrobotics @armsentient
8 steps from a 3D printer to an autonomous quadruped robot. we made the assembly guide so you can build one too.
1. print chassis parts. PLA+, clean supports, done.
2. install servos. 12× MG92R mounted into the frame, orientation matters.
3. wire PCA9685. servo signal lines, power, I2C comms, all connected.
4. mount STM32. standoffs and screws onto the central platform.
5. connect IMU. MPU6050 over I2C for orientation and motion tracking.
6. add voice module. LD3320 for speech recognition, mic and speaker wired.
7. install battery. LiPo in, verify polarity before you fry everything.
8. calibrate. plug into a computer, set servo center offsets and motor limits.
that's it. from filament to a walking robot with voice control and AI cognition. every STL file, every firmware hex, every schematic is in the repo. open-source means you don't just look at it, you build it.
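the calibration step boils down to pulse-width math: the PCA9685 splits each PWM period into 4096 ticks, so center offsets become tick counts. a sketch of that conversion — the servo limits here are typical hobby-servo values, not measured MG92R specs.

```python
def us_to_ticks(pulse_us: float, freq_hz: float = 50.0) -> int:
    """Convert a servo pulse width in microseconds to a PCA9685
    12-bit tick count (4096 ticks per PWM period)."""
    period_us = 1_000_000.0 / freq_hz
    return round(pulse_us * 4096 / period_us)

def centered_pulse(center_us: float, offset_us: float,
                   min_us: float = 500.0, max_us: float = 2500.0) -> float:
    """Apply a per-servo calibration offset, clamped to assumed
    servo travel limits so a bad offset can't slam a joint."""
    return min(max_us, max(min_us, center_us + offset_us))
```

e.g. a 1500 µs center pulse at 50 Hz lands at tick 307 of 4096.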
Armsentientrobotics @armsentient

6 minutes of what 4 months of building a quadruped robot actually looks like. started in SolidWorks. every bracket, every servo mount, every joint clearance designed from scratch. the blue panel you see? that's the main chassis plate, CNC-ready, holding 12 servos, an STM32 brain, IMU, and a full robotic arm. this isn't "i forked a repo and added a README." this is PCB schematics → CAD modeling → 3D printing → firmware flashing → gait tuning → voice integration → RL training pipeline. leg brackets designed around MG92R servo dimensions. arm linkage with gripper clearance calculated for pickup sequences. mounting holes for PCA9685 servo driver and MPU6050 IMU. power routing separated between MCU and servo rails because 12 servos pulling 5A will brown out your controller faster than you can debug it. 4 months. 32 commits. hardware committed before a single line of hype was written. the robot walks. it grabs. it listens. it thinks. and now it has a token. $ASR github.com/ohmyzaid/ArmSe…

Armsentientrobotics @armsentient
4 months. 4 phases. from CAD files to a walking, thinking robot.
Month 1, Hardware: design finalization in SolidWorks, component sourcing, prototype assembly and initial testing. this is where the PCB Rev 1.0 and 3D-printed chassis came to life.
Month 2, RL Training: simulation environment setup in Isaac Gym, reward function design, policy optimization, model training and validation in sim. hundreds of virtual quadrupeds learning to walk before the real one takes a single step.
Month 3, Claude AI Integration: API connection, interface development, contextual understanding and command processing, language model fine-tuning and testing. giving the robot a brain that reasons in natural language.
Month 4, System Integration & Testing: hardware-software unification, real-world environment trials, debugging, final system validation and deployment readiness.
then launch. most projects skip straight to Month 4 and wonder why nothing works. we did it in order. git history proves it.
Armsentientrobotics @armsentient
custom PCB and full power distribution schematic. this is what "we build hardware" actually means. image 1: Robot Controller PCB Rev 1.0. STM32F103C8 center-mounted. PCA9685 PWM driver up top with 16 servo channels. MPU6050 IMU on the right for orientation. LD3320 voice module on the right. dual voltage regulators, 3.3V and 5V, sitting side by side. USB-C for debug and power. SWD header for flashing firmware. every trace routed, every pad placed. image 2: the power system nobody wants to design but everyone needs. 3S LiPo (11.1V) comes in with reverse polarity protection (P-MOSFET + diodes), overcurrent protection (PTC fuse), and undervoltage protection (comparator circuit with MOSFET cutoff at 10V). from there it splits: LM2596 buck converter steps down to 5V system power. AMS1117-3.3 LDO regulates down to 3.3V logic for STM32. separate high-current 6V servo regulator powers all servos independently. TP4056 charging module for the LiPo. TXB0104 logic level shifters bridging 3.3V MCU signals to 6V servo domain. this isn't a breadboard prototype. this is Rev 1.0 engineering-grade design with protection circuits that prevent your robot from catching fire. 4 months. custom PCB. custom power distribution. open-source.
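the undervoltage cutoff described above is just divider math plus a threshold. a sketch of how an MCU-side check would recover pack voltage from a divider-scaled ADC reading — the divider values and ADC resolution here are assumptions, not the actual schematic's.

```python
def battery_volts(adc_raw: int, vref: float = 3.3, adc_bits: int = 12,
                  r_top: float = 10_000.0, r_bot: float = 3_300.0) -> float:
    """Recover 3S LiPo pack voltage from a divider-scaled ADC reading.
    Divider resistors and 12-bit ADC are illustrative guesses."""
    v_adc = adc_raw / ((1 << adc_bits) - 1) * vref
    return v_adc * (r_top + r_bot) / r_bot

def undervoltage(v_pack: float, cutoff: float = 10.0) -> bool:
    """Mirror the comparator circuit described above: cut off below 10V."""
    return v_pack < cutoff
```

on the real board this decision is made in hardware (comparator + MOSFET), which keeps the robot safe even if firmware hangs.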
Armsentientrobotics @armsentient

this is the wiring diagram of a robot that walks, grabs, and listens to your voice. STM32F103C8 at the center running the whole show. left side: MPU6050 IMU feeding accelerometer and gyro data so the robot knows which way is up. right side: LD3320 voice module so you can talk to it and it actually responds. below the brain: PCA9685 servo driver on I2C, splitting one data line into 16 independent PWM channels. 12 of those channels driving MG92R high-torque servos, 3 per leg, 4 legs. the remaining channels running the gripper/arm mechanism on the far right. power: LiPo battery with separated rails. servo power goes straight to PCA9685's V+ line. logic power goes to STM32. you cross those rails and 12 servos stalling simultaneously will kill your microcontroller in milliseconds. every wire on this diagram exists on a real PCB. schematics committed to GitHub 4 months ago. this isn't a fritzing sketch someone made in an afternoon. this is production wiring for a robot that's already walking.
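the channel split above (12 leg servos on one I2C data line, arm/gripper on the rest) maps cleanly to a table. a hypothetical channel map — leg names, joint names, and channel order are illustrative, not from the actual wiring.

```python
# 16-channel PCA9685: legs on channels 0-11, arm/gripper on the remainder.
LEGS = ("FL", "FR", "RL", "RR")      # front/rear, left/right (assumed order)
JOINTS = ("hip", "thigh", "knee")    # 3 joints per leg

def leg_channel(leg: str, joint: str) -> int:
    """Map a (leg, joint) pair onto PCA9685 channels 0-11."""
    return LEGS.index(leg) * len(JOINTS) + JOINTS.index(joint)

ARM_CHANNELS = (12, 13, 14)          # arm joints on the remaining channels
GRIPPER_CHANNEL = 15
```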

Armsentientrobotics @armsentient
zooming into the brain of ArmSentientRobot. STM32F103C8 at the center. ARM Cortex-M3 core running at 72MHz with flash memory, SRAM, DMA controller, advanced timers, I2C and USART interfaces all on one chip. left side over I2C: PCA9685 PWM driver controlling all 12 servos + gripper. right side over I2C: MPU6050 IMU feeding orientation data back into the loop. below the MCU, the software layer: PID control loops keeping servos stable, IK solver algorithms computing joint angles from foot targets in real-time. these aren't running on a laptop. they're running on a $2 microcontroller at 50Hz. comms over USART: LD3320 voice recognition module for spoken commands. ESP32 BLE module for wireless control from your phone. DMA controller means sensor reads don't block the main loop. advanced timers mean precise PWM timing without CPU overhead. this is how you run inverse kinematics + PID + sensor fusion on a chip with 20KB of RAM. every block on this diagram maps to real firmware in the repo.
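the IK solver mentioned above reduces, for the two in-plane joints of a leg, to the classic two-link solution. a simplified sketch of that math — the actual firmware solves the full 3-DOF leg, and the link lengths here are placeholders.

```python
import math

def leg_ik(x: float, y: float, l1: float, l2: float):
    """2-link planar IK: foot target (x, y) in the hip frame ->
    (hip, knee) angles in radians. A simplification of the 3-DOF
    per-leg solver described above."""
    d2 = x * x + y * y
    # law of cosines gives the knee angle from the target distance
    c = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c <= 1.0:
        raise ValueError("target out of reach")
    knee = math.acos(c)
    # hip angle = direction to target minus the offset the bent knee adds
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee
```

a fully extended leg pointing straight down (target (0, -2) with unit links) gives knee = 0 and hip = -π/2, a quick sanity check.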
Armsentientrobotics @armsentient
four concentric rings. one robot brain. outside ring: Claude AI handles contextual understanding and high-level planning. you talk to it, it reasons about the mission. second ring: reinforcement learning layer. training environments, PPO policies, policy optimization. the robot doesn't follow scripts, it learns how to move. third ring: control layer. STM32 microcontroller running inverse kinematics and PID controllers in real-time. math turns intent into motion. core: the hardware itself. servos, IMU, LiDAR, cameras, force sensors, power management. metal, wire, and purpose. and it doesn't stop at the robot. cloud services handle data storage, updates, and analytics. mobile app gives you teleoperation and a user interface from your phone. this is the full loop. cognition wraps around learning wraps around control wraps around hardware. every layer talks to every adjacent layer. bidirectional arrows everywhere because real autonomy isn't top-down, it's a conversation between thinking and doing. 4 months of building this stack. open-source.
Armsentientrobotics @armsentient
this is what happens when you give a quadruped robot a brain stack. 4 layers, from cloud cognition to metal on the ground. Claude API processes natural language → reasoning engine plans strategy → RL policies translate goals into adaptive gaits → STM32 runs IK + PID at 50Hz → 12 servos move, sensors feed back up the chain. the robot doesn't just execute commands. it reasons, learns, adapts, and acts. open-source. every layer.
Cognitive Layer: Claude API cloud service. NLP intent parsing → reasoning engine → task planning → knowledge base + external data API. outputs strategic commands/goals.
Learning Layer: RL policies (actor-critic). policy network + value network + training buffer + reward calculation + state estimation + action selection. PPO/SAC. outputs high-level action commands.
Control Layer: STM32F103C8. motor drivers → sensor interfaces (ADC, SPI, I2C, UART) → RTOS → kinematics engine → PID control loops → UART/CAN comms. 50Hz servo loop.
Hardware Layer: quadruped body + 4× leg actuators + 3-DOF robotic arm + gripper + IMU + LiDAR + camera + joint encoders + force sensors (feet) + battery pack.
bidirectional data flow: sensor data (raw) flows up, actuator commands (PWM) flow down. state/task data bridges cognitive ↔ learning layers. color-coded: green (cognitive), orange (learning), deep blue (control), cyan (sensor/state).
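the control layer's PID loop at 50Hz can be sketched in a few lines. an illustrative discrete-time version — the gains are placeholders, not tuned values from the firmware, and the real loop runs in C on the STM32.

```python
class PID:
    """Discrete PID controller stepped at a fixed rate.
    dt = 0.02 s matches the 50Hz servo loop described above."""
    def __init__(self, kp: float, ki: float, kd: float, dt: float = 0.02):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint: float, measured: float) -> float:
        err = setpoint - measured
        self.integral += err * self.dt                 # accumulate I term
        deriv = (err - self.prev_err) / self.dt        # finite-difference D term
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

one instance per joint, fed the IK solver's angle targets as setpoints.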
Armsentientrobotics @armsentient
4 months of commits. hardware → firmware → voice → AI pipeline. commit timeline: PCB schematics + CAD assets, 4mo ago → arm control firmware + voice relay, 2mo ago → LD3320 voice recognition, 3wk ago → core cleanup + architecture docs, 4 days ago. 32 commits. real development cadence. not a README-only repo. $ASR
Armsentientrobotics @armsentient
open-sourced a 12-DOF quadruped with a robotic arm, voice control, and Claude AI cognitive planning. it walks. it grabs. it thinks. CA: Fzcsu3cQcpF84rVwJe9h2fouz9WiiGJUsvj6DPAMpump. 4-layer architecture from cloud reasoning to 50Hz servo loops. STM32 + PPO/SAC reinforcement learning + sim-to-real transfer. not a render. not a pitch deck. 32 commits deep, hardware shipping. $ASR