Luxonis | Robotic Vision

2.7K posts


@luxonis

Robotic vision, made simple.

Denver, CO · Joined December 2012
169 Following · 3.3K Followers
Pinned Tweet
Luxonis | Robotic Vision @luxonis
Introducing OAK 4 + Luxonis Hub. AI can only solve the problems it can see, and putting sensors, compute, and vision into the real world has always been the hard part. OAK 4 changes that. Powered by @Qualcomm, it delivers 52 TOPS on-device, runs multiple models in parallel, and works fully standalone with no host PC or cloud. With Luxonis Hub, you can deploy models, update apps, and manage devices remotely with zero setup. It’s time to see what you’ve been missing. Learn more: luxonis.com/oak4
Luxonis | Robotic Vision
Maintaining stereo depth calibration in extreme environments just got easier. With DepthAI 3.5, we are introducing the AutoCalibration host node. This early developer release dynamically verifies and improves stereo alignment during runtime—no manual intervention needed. Enable it instantly with zero code changes: DEPTHAI_AUTOCALIBRATION=CONTINUOUS Or use the explicit Python/C++ API for full control. We are sharing this early so you can test it in high-vibration or variable-temp deployments. Try it out and share your feedback on GitHub to help us shape the final release. 🗞️Full Release: discuss.luxonis.com/blog/6759-auto…
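For the zero-code path, the environment variable just needs to be set before DepthAI initializes. A minimal Python sketch, assuming only the variable name and `CONTINUOUS` value from this release (the `depthai` import is commented out since this shows only the setup step):

```python
import os

# Enable continuous auto-calibration before DepthAI starts up.
# With this set, DepthAI 3.5 verifies and refines stereo alignment
# at runtime with no further code changes.
os.environ["DEPTHAI_AUTOCALIBRATION"] = "CONTINUOUS"

# import depthai as dai  # the rest of the pipeline is unchanged
print(os.environ["DEPTHAI_AUTOCALIBRATION"])  # → CONTINUOUS
```

The same effect can come from the shell (`export DEPTHAI_AUTOCALIBRATION=CONTINUOUS`) when you'd rather not touch application code at all.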
Luxonis | Robotic Vision retweeted
Max Mclaughlin @MaxMclaughlin16
Sharing a super cool use case from our partners at @PixeeMedical, who’ve built an AR headset using OAK-1. Not only is the device sleek, but the operating room is one of the most challenging environments for computer vision, with reflective tools, harsh lighting, and zero margin for error. Super impressive work from the team. Insane to think OAK has guided surgeons in over 10,000 surgeries. I wonder how many more until that data can train a fully autonomous robot to do it. Read the full story: discuss.luxonis.com/blog/6734-pixe…
Luxonis | Robotic Vision @luxonis
New Case Study: Powering Wearable AR for Surgical Navigation. The operating room demands absolute perception stability. Pixee Medical chose the Luxonis OAK 1 to power Knee+, an ultra-compact AR headset giving orthopedic surgeons real-time 3D navigation. By leveraging our SDK and on-device processing, Pixee achieved stable tracking despite glaring surgical lights and metallic reflections, all within a lightweight wearable. Read how edge vision is enhancing clinical outcomes in over 10,000 surgeries: discuss.luxonis.com/blog/6734-pixe… #ComputerVision #EdgeAI #SurgicalTech #AR
Luxonis | Robotic Vision @luxonis
We just added three new real-world HubAI examples to the Hubstore, providing ready-to-use foundations for your edge vision pipelines. • Data Collection (Snaps): Capture and structure training-ready datasets directly from your OAK device to streamline custom model training. • Dino Tracking: Interactively select and track any object in the frame dynamically, no predefined classes needed. • People Demographics: Run multiple parallel models for retail analytics, including age, gender, mood, and face tracking. Check out the apps and the new Snaps workflow here: discuss.luxonis.com/blog/6720-new-…
Luxonis | Robotic Vision @luxonis
We are thrilled to power the "See Beyond" track at GDG AI HACK 2026, where you can get hands-on with the Luxonis OAK 4 to build real-time Spatial AI applications. Apply by April 17 to secure your spot and start building in 3D! gdgaihack.com
Luxonis | Robotic Vision
⏳ 1 week left to submit to the @MaCVi_Team CVPR 2026 Challenge! Segment the LaRS dataset (Sky, Water, Obstacles) for a chance to win $500 in Luxonis hardware. Models are evaluated directly on our next-gen RVC4 architecture. Due March 15: macvi.org/workshop/cvpr/… #ComputerVision #EdgeAI #CVPR2026
Lukas Ziegler @lukas_m_ziegler
The global farm labor crisis isn’t new, but automation’s response finally is. From autonomous tractors to drone-mounted crop scouts, robots are now performing end-to-end workflows: 🌱 Planting 🧠 Monitoring 🤖 Weeding 🌾 Harvesting And they’re getting smarter with each season.
Lukas Ziegler @lukas_m_ziegler
🧵 Farming robots are no longer experimental. They're deployed, profitable, and reshaping agriculture. In orchards, vineyards, vegetable fields, and beyond, they're tackling labor shortages, precision spraying, and chemical reduction at scale. This is how robotics is quietly becoming the backbone of next-gen agriculture [Save this thread for later 📌]
Axel @ax_pey
We made our robot calibrate its depth camera by holding a sign with its little arm. You only have to put a piece of cardboard in its hand and press a button in the app. Pretty cool :)
Luxonis | Robotic Vision @luxonis
How do you teach a humanoid robot? You show it. 🤖👓 Pollen Robotics is redefining embodied AI with Reachy 2. To get seamless, low-latency VR teleoperation, they needed the robot to see like a human. Enter the Luxonis OAK FFC-4P. 👇 By separating our edge compute board from the sensors, Pollen achieved: - 64mm stereo baseline (true human eye distance) - On-device stereo rectification & compression - Blazing-fast 125ms total latency Now part of Hugging Face, Pollen is using this exact vision stack to move Reachy 2 from teleop to true autonomy! 🚀 Read the full story: discuss.luxonis.com/blog/6709-poll… #Robotics #ComputerVision #EdgeAI #Luxonis #HuggingFace #EmbodiedAI
Luxonis | Robotic Vision @luxonis
#ROS2 running fully onboard OAK4-D. No external computer required. In our latest demo, OAK4-D handles: • Person detection • Spatial coordinates • Velocity generation • Publishing to /cmd_vel All perception + control logic runs directly on the device. Packaged as a Luxonis App, so you can deploy and manage via Hub with OTA updates. This demo shows the workflow. What you build, and how you scale it, is up to you. discuss.luxonis.com/blog/6695-runn… @OpenRoboticsOrg
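The velocity-generation step in a pipeline like this often reduces to a proportional controller on the detection's spatial coordinates. A hypothetical sketch, not Luxonis code: the function name, gains, and target distance below are all illustrative, showing how a person's position could map to the linear/angular pair a node would publish on /cmd_vel.

```python
def velocity_from_detection(x_m, z_m, target_z=1.5,
                            k_lin=0.5, k_ang=1.0,
                            v_max=0.6, w_max=1.0):
    """Map a person's spatial coordinates (meters, camera frame) to a
    (linear, angular) velocity pair like one published on /cmd_vel."""
    # Drive forward/backward to hold the target following distance,
    # clamped to the robot's maximum linear speed.
    linear = max(-v_max, min(v_max, k_lin * (z_m - target_z)))
    # Turn toward the person: x_m is the lateral offset, so a person
    # to the right (positive x) yields a negative (clockwise) turn.
    angular = max(-w_max, min(w_max, -k_ang * x_m))
    return linear, angular

# Person 3 m ahead and 0.5 m to the right: drive forward, turn right.
print(velocity_from_detection(x_m=0.5, z_m=3.0))  # → (0.6, -0.5)
```

In a real rclpy node these two numbers would fill the `linear.x` and `angular.z` fields of a `geometry_msgs/Twist` message each time a detection arrives.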
Luxonis | Robotic Vision @luxonis
Classical stereo or neural depth? You don’t have to choose. Neural-Assisted Stereo (NAS) combines fast SGM with lightweight LENS guidance to deliver high-resolution, high-fill depth at real-time speeds. Up to 45 FPS. Better low-texture depth. Fewer artifacts. Built for the edge. Full Release: discuss.luxonis.com/blog/6656-neur…
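"Fill" here is just the fraction of pixels that end up with a valid depth value. A tiny illustrative helper for comparing outputs such as plain SGM vs. NAS — our own definition for this post, not a DepthAI API:

```python
def fill_rate(depth_map):
    """Fraction of pixels with a valid (non-zero) depth value.
    Low-texture regions that SGM leaves empty drag this number down;
    neural guidance is what pushes it back up."""
    pixels = [d for row in depth_map for d in row]
    return sum(1 for d in pixels if d > 0) / len(pixels)

# A 4x4 map where classical matching only resolved the top half:
sparse = [[1.2, 1.3, 1.1, 1.4],
          [1.2, 1.3, 1.1, 1.4],
          [0.0, 0.0, 0.0, 0.0],
          [0.0, 0.0, 0.0, 0.0]]
print(fill_rate(sparse))  # → 0.5
```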
Luxonis | Robotic Vision @luxonis
OAK being used for teleoperation with... brain-controlled exoskeletons?! The world of Star Trek is coming.
Lukas Ziegler@lukas_m_ziegler

Brain-controlled exoskeletons to train humanoid robots! 🧠 @FourierRobots just presented human tele-operators using brain control interfaces and exoskeletal arms to train humanoid robots on home tasks. The brain control interface is the interesting part. Instead of using a controller or joystick to teleoperate, the operator's movements and intentions are captured more naturally through the exoskeleton and BCI. This means the demonstrations are more fluid, more human-like, and better suited for training robots to perform delicate home tasks. Multiple tele-operators are simultaneously generating training data across multiple robots. This is how you build the dataset needed for eventual full autonomy, without waiting years for it to arrive. This might be the bridge between "robots that work in controlled environments" and "robots that work in homes." Not full autonomy right away, but trusted human intelligence operating through a robot body, getting better with every task completed. ~~ ♻️ Join the weekly robotics newsletter, and never miss any news → ziegler.substack.com

Luxonis | Robotic Vision @luxonis
Taking edge compute seriously means designing for where it actually runs. We just shared a real off-grid, solar-powered OAK 4 D deployment running continuously with no mains power, no host PC, and minimal connectivity. Because OAK runs vision fully on-device, only inference results are sent over cellular, not raw video. That keeps bandwidth low and removes the need for a power-hungry host computer. A single PoE cable handles power and data. The system runs ~60 hours without sunlight and recovers automatically when solar returns. A practical edge setup for remote monitoring, agriculture, construction, and field research.
Luxonis | Robotic Vision @luxonis
Had a great time at #CES2026—but not as much fun as some of our OAKs playing ping pong and piano. It was great meeting many of you, exploring new technologies, and feeling the momentum around physical AI and real-world automation. Leaving CES excited for what 2026 has in store. Got any projects on your horizon requiring robotic perception or machine vision? We're happy to chat about how OAK can bring automation into the 3D world your business actually operates in: meetings-eu1.hubspot.com/cj-mann/discov… P.S. If we missed your OAK demo at the show, let us know 👇
Luxonis | Robotic Vision
Sharing our partner @dataguess, who integrates their no-code Inspector software directly into OAK 4 S to deliver an all-in-one, self-contained solution for streamlined quality assurance and industrial automation. As they put it: “We wanted an all-in-one, compact solution that could be easily deployed on the factory floor. Moving to RVC4 made that possible, delivering lower latency and strong real-time performance—even with larger models.”