Vignesh Anand
9 posts

Vignesh Anand
@Vignesh_Robots
Innate (F24). Former Graduate Student @stanford & Embodied AI and Robotics Lab.
Stanford, CA, USA · Joined June 2024
58 Following · 433 Followers
Vignesh Anand reposted

Introducing MARS, the first Personal AI Robot.
In technology, getting to the future we want starts with building it together.
This is why we designed Mars. Inspired by early PCs, Mars is a powerful, complete, extendable robot.
It runs BASIC, an open embodied AI agent we created for you to program with code, demonstrations, and prompts. This allows Mars to perform complex long-horizon tasks involving spatial memory, reasoning, manipulation and navigation.
Pre-orders open now
⬇️ Demos & details in the thread
Vignesh Anand reposted

This is the year of mobile manipulators 🤖
We are soon releasing a small, affordable, open-source one for the AI & robotics community
• Onboard compute
• Smooth teleop
• Complete suite of sensors + extensible
Join our discord on @innate_bot's profile to build with us!
Vignesh Anand reposted

Experiment of the day: How would @danylo_movchan's dog react to being fed by a robot?
Maybe one day they will be friends
Vignesh Anand reposted

Can you teach your mobile robot without a computer in 30 seconds?
Yes, with a phone connected to a teleop arm! 🦾📲
Here, grabbing a ping-pong ball for @Vignesh_Robots
Want one?
Come build in the open with us @innate_bot !

@meetsitaram @ax_pey The Jetson Orin Nano is the onboard computer.

@ax_pey @Vignesh_Robots Awesome. What hardware do you use for making it portable? Some Raspberry Pi or NVIDIA Jetson?

We gave a robot body to a GPT-4o AI agent.
Here, serving a glass from one human to another 🍹
With only:
• 30 min of arm demonstrations
• A language prompt to guide it
You can also teach it to clean, play, check in on you.
What would you teach it?
with @Vignesh_Robots

@Aadhithya_D2003 @ax_pey All arm motion is based on a single camera and uses the latest robot-learning research: an end-to-end manipulation policy trained on about 30 minutes of demonstrations.
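As a rough illustration of what "end-to-end policy trained from demonstrations" means (this is a generic behavior-cloning sketch on synthetic data, not Innate's actual model or training stack), a policy maps a camera frame directly to joint commands and is fit by regressing onto the demonstrated actions:

```python
import numpy as np

# Toy behavior-cloning sketch (hypothetical): a linear "policy" maps a
# flattened camera frame straight to joint angles, trained by gradient
# descent on mean-squared error against demonstrated actions.

rng = np.random.default_rng(0)

# Synthetic "demonstrations": 200 frames (4x4 grayscale, flattened to 16
# features) paired with 4 demonstrated joint angles each.
frames = rng.normal(size=(200, 16))
true_W = rng.normal(size=(16, 4)) * 0.1
actions = frames @ true_W  # joint targets recorded during teleop demos

# Train the policy: plain full-batch gradient descent on MSE.
W = np.zeros((16, 4))
lr = 0.05
losses = []
for _ in range(300):
    pred = frames @ W          # policy output for every demo frame
    err = pred - actions
    losses.append(float(np.mean(err ** 2)))
    W -= lr * (frames.T @ err) / len(frames)

print(losses[0], losses[-1])  # imitation loss shrinks as the policy fits the demos
```

A real system would replace the linear map with a vision network and the synthetic arrays with logged teleoperation data, but the training loop has the same shape.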

@ax_pey @Vignesh_Robots Makes sense. How are you calculating the angles the arm's motors need to move through? We use the video feed to find the object's position, compute its distance from the arm's actuator, and calculate the required angle from that.
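The approach described in this reply, turning a measured object position into motor angles, is classic inverse kinematics. A minimal sketch for a planar two-link arm, assuming link lengths and a target position in the arm's base frame (all names here are hypothetical, not either project's actual code):

```python
import math

def two_link_ik(x, y, l1, l2):
    """Planar two-link inverse kinematics.

    Given a target (x, y) in the shoulder's frame and link lengths l1, l2,
    return (shoulder, elbow) joint angles in radians (elbow-up solution).
    """
    d2 = x * x + y * y
    # Elbow angle from the law of cosines on the shoulder-elbow-target triangle.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    cos_elbow = max(-1.0, min(1.0, cos_elbow))  # clamp against rounding error
    elbow = math.acos(cos_elbow)
    # Shoulder angle: direction to the target, minus the offset the bent
    # elbow introduces between the first link and the target direction.
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow)
    )
    return shoulder, elbow
```

The learned end-to-end policy Innate describes skips this geometry entirely, letting the network map pixels to motor commands, but explicit IK like this is the usual baseline when you already have the object's position from vision.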


