Ouster

816 posts

@ousterlidar

Sensing & perception for Physical AI across industrial, robotics, automotive, and smart infrastructure (Nasdaq: OUST)

San Francisco, CA · Joined August 2018
274 Following · 7.3K Followers
Pinned Tweet
Ouster@ousterlidar·
We're entering a new era: machines don't just detect the world, they understand it. Introducing REV8:
Ouster@ousterlidar·
This is a Rev8 OS1 Max driving through an active work zone, capturing every sign, cone, and worker in native color with depth resolved at the silicon level. See the dataset in Ouster Studio: studio.ouster.com/share/X5VQVEG1…
Ouster@ousterlidar·
With road work, lane lines deviate, signs change, and workers may stand in spaces that, a day earlier, were drivable lanes. Every cue that tells an autonomous system the rules have temporarily changed is encoded in colorful temporary signage or subtly in the environment itself. Until Rev8, capturing that color information required a camera stream that had to be calibrated, in real time, against a separate lidar.
Ouster retweeted
Cheddar@cheddar·
Ouster CEO Angus Pacala explains how next-gen LiDAR and physical AI are transforming robotics, automation, and autonomy. cheddar.com/media/why-oust…
Ouster@ousterlidar·
Hey San Francisco! Join us May 27th for a deep dive into Ouster's new REV8 sensors. We’re taking over a brewery in the Mission District for a technical exploration of our latest hardware, featuring a short overview of REV8, live demos, and an open Q&A with the engineering team. Come for the sensors, stay for the craft beer and community. Invite link in the thread 🍻
Ouster@ousterlidar·
With Rev8, color is perceived in the right place, at the right distance, at the right moment. Ouster L4 silicon captures color and depth in the same photon event, with Fujifilm color science embedded directly on the chip. This is a Rev8 OS1 Max driving the Embarcadero in San Francisco. Every traffic signal, brake light, and bike lane resolved in 48-bit native color, perfectly aligned with depth at the silicon level.
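The "48-bit native color" figure implies 16 bits per RGB channel per return. As a purely illustrative sketch (an assumption about layout, not Ouster's actual packet format), a fused return could pack range and color together like this:

```python
import struct

# Illustrative layout only (NOT Ouster's wire format): one fused return
# = a 32-bit range in millimeters plus 48-bit color, i.e. three 16-bit
# RGB channels captured with the same photon event as the range sample.
def pack_point(range_mm: int, r: int, g: int, b: int) -> bytes:
    # "<IHHH" = little-endian, unpadded: one uint32 + three uint16
    return struct.pack("<IHHH", range_mm, r, g, b)

pt = pack_point(12_500, 65535, 32768, 0)  # 12.5 m range, orange-ish color
assert len(pt) == 10                      # 4 range bytes + 3 * 2 color bytes
assert struct.unpack("<IHHH", pt) == (12_500, 65535, 32768, 0)
```

The point of the sketch is only the width: 48 bits of color rides alongside each depth sample in one record, with no second sensor stream to merge.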
Ouster retweeted
The Road to Autonomy®@RoadToAutonomy·
Angus Pacala, co-founder and CEO of @OusterLiDAR joined @gbrulte to discuss the launch of REV8, the first native color LiDAR, and how Ouster is becoming the sensing company for the autonomy economy.
Episode Chapters:
0:00 @AUTNMYAI
0:36 Changing LiDAR Industry
03:56 Introducing REV8
13:42 Building Trust with Safety-Critical LiDAR
17:53 Why Custom Silicon is Ouster's Moat
25:33 Color Science Behind REV8
33:28 Can Color LiDAR Replace Cameras?
36:36 StereoLabs Acquisition
40:07 Ouster as a Sensing Company
49:46 Defense Applications
52:14 Future of Ouster
$OUST
Ouster@ousterlidar·
Team Ouster is at Xponential in Detroit! Come by booth 31036 to see what a 3D point cloud looks like when every point has RGB baked in
Ouster@ousterlidar·
"Ouster’s work with NVIDIA DRIVE centers on providing the high-performance sensing required for the next generation of autonomous vehicles." - Ouster CEO Angus Pacala. 🔗Read the press release here: businesswire.com/news/home/2026…
Ouster@ousterlidar·
Unlocking Next-Generation AV Development:
✅ Rev8 provides the world’s first native color lidar data necessary to train next-generation world models and enable safer autonomous navigation at scale.
✅ Flagship OS1 Max delivers 256 channels of high-definition sensing up to 500 meters in all directions.
✅ Inherently fused color and depth data allows for automated annotation pipelines, reducing the time and cost of training autonomous systems.
✅ Optimized plugins for the OS family within the NVIDIA DriveWorks SDK.
Ouster@ousterlidar·
🚨Ouster Brings REV8 Native Color Lidar to the NVIDIA DRIVE Hyperion Platform Our new Rev8 OS family of digital lidar sensors is now qualified to run on the @NVIDIADRIVE Hyperion platform, providing developers a streamlined path from development to full-scale deployment of Level 4 autonomous vehicles.
Ouster@ousterlidar·
A tunnel exit takes a camera from ~1 lux to ~100,000 lux in under a second. Auto-exposure can't keep up, highlights blow out, and the first frames after the exit are unusable. This is a Rev8 OS1 Max capturing the Yosemite Wawona Tunnel exit: 116 dB of dynamic range, hardware HDR, and color captured in the same photon event as depth, so every point holds its color through six orders of magnitude of lighting change. See the dataset in Ouster Studio: studio.ouster.com/share/7TC61PHE…
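To sanity-check the numbers in the tweet: a dynamic-range spec in decibels converts to a linear contrast ratio as 10^(dB/20), so 116 dB is roughly 631,000:1, close to the quoted "six orders of magnitude" and comfortably covering the ~100,000:1 (100 dB) tunnel-exit swing. A quick check, with `db_to_ratio` as an illustrative helper name:

```python
import math

# Convert a dynamic-range figure in decibels to a linear contrast ratio.
# For amplitude-style sensor specs, ratio = 10 ** (dB / 20).
def db_to_ratio(db: float) -> float:
    return 10 ** (db / 20)

ratio = db_to_ratio(116)      # ~631,000:1
orders = math.log10(ratio)    # ~5.8 orders of magnitude
# The ~1 lux -> ~100,000 lux tunnel exit is a 100,000:1 (100 dB) swing,
# which fits inside the sensor's quoted 116 dB range.
print(f"{ratio:,.0f}:1, {orders:.1f} orders of magnitude")
```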
Trip Capital@TripCapitalLabs·
@sinanisler @ousterlidar Well I think data collection and inference are done from the car, which makes sense. I’m just saying that this render here is from an artificially high viewpoint (well above the car).
Ouster@ousterlidar·
Today’s AV stacks infer traffic signal and brake light state from camera pixels, then try to align them to lidar geometry. Layers between sensing and decision. Humans see color and depth together because they arrive together. Rev8 does the same: 48-bit RGB + 3D depth, same photon, same silicon, already aligned.
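The camera-then-align pipeline the tweet contrasts against can be sketched as follows. Everything here is illustrative, not a real SDK API: `colorize`, the extrinsics `R`, `t`, and the intrinsics `K` are assumed names for the calibration step such a stack must run per frame.

```python
import numpy as np

# Sketch of conventional camera+lidar coloring: each lidar point is
# moved into the camera frame by calibrated extrinsics (R, t), projected
# through the camera intrinsics K, and its color looked up in the image.
def colorize(points_lidar, image, K, R, t):
    colors = []
    for p in points_lidar:
        pc = R @ p + t                    # lidar frame -> camera frame
        u, v, w = K @ pc                  # homogeneous image coordinates
        ui, vi = int(u / w), int(v / w)   # perspective divide to pixels
        if 0 <= vi < image.shape[0] and 0 <= ui < image.shape[1]:
            colors.append(image[vi, ui])  # color borrowed from the camera
        else:
            colors.append(None)           # point outside the camera's view
    return colors

# Toy check: a point straight ahead lands on the image center pixel.
K = np.array([[100.0, 0.0, 32.0], [0.0, 100.0, 32.0], [0.0, 0.0, 1.0]])
img = np.zeros((64, 64, 3))
img[32, 32] = [0.5, 0.2, 0.1]
print(colorize([np.array([0.0, 0.0, 1.0])], img, K, np.eye(3), np.zeros(3)))
```

Every color this produces is only as good as the calibration and the time sync between the two sensors; any drift in `R`, `t`, or timestamps misplaces colors, which is the alignment burden the tweet says in-silicon fusion removes.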
Ouster@ousterlidar·
There's some confusion about occlusion and "shadows" in the comments. You can observe 3D data from any viewing angle, and at any angle other than the exact origin point you will see occlusions: shadows created by objects blocking what's behind them. The first video shows exactly what a camera sees (occlusions and all); it's just novel for many to see the occlusions so clearly. Here's the first-person view of the same drive, from the origin, like you'd get from a camera:
[Quoted tweet: @ousterlidar's Rev8 48-bit RGB + depth post, above]
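The occlusion point can be made concrete with a minimal z-buffer render. This is an assumption-laden sketch, not Ouster Studio's actual renderer: projecting a point cloud into any virtual camera keeps only the nearest point per pixel, so foreground objects cast data "shadows" behind themselves.

```python
import numpy as np

# Minimal z-buffer: project each 3D point into a virtual pinhole camera
# (focal length f, res x res pixels) and keep only the nearest point per
# pixel. Anything behind a kept point is occluded, i.e. a "shadow".
def render(points, colors, f=100.0, res=64):
    depth = np.full((res, res), np.inf)
    image = np.zeros((res, res, 3))
    for p, c in zip(points, colors):
        x, y, z = p
        if z <= 0:                       # behind the virtual camera
            continue
        u = int(f * x / z + res / 2)     # pinhole projection to pixels
        v = int(f * y / z + res / 2)
        if 0 <= u < res and 0 <= v < res and z < depth[v, u]:
            depth[v, u] = z              # nearer point wins the pixel
            image[v, u] = c
    return image, depth

# A near red point and a far blue point on the same ray: red occludes
# blue, and blue leaves no trace - the "shadow" discussed in the replies.
pts = [(0.0, 0.0, 1.0), (0.0, 0.0, 2.0)]
cols = [(1.0, 0.0, 0.0), (0.0, 0.0, 1.0)]
img, dep = render(pts, cols)
print(img[32, 32], dep[32, 32])
```

Rendering from the exact sensor origin reproduces the camera-like first-person view; moving the virtual camera anywhere else reveals the occluded regions as empty space.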
Ouster@ousterlidar·
Ouster CEO Angus Pacala spoke to @SchwabNetwork this morning about our record quarterly results and the future of Physical AI, powered by our unified platform of new REV8 native color lidar sensors and @Stereolabs3D cameras. "We're firing on all cylinders... we're in the earliest innings here, which is just another strong promise for Ouster's future growth for years to come." schwabnetwork.com/video/ouster-c…
Ouster@ousterlidar·
@usppdd During the day. We don't get 3D depth returns from the sky, but we can turn the sky points on/off in the visualizer for scene context. You'll increasingly see a mix of images from us, some with background context and some without.
Ouster@ousterlidar·
REV8 is more than just a 3D depth sensor - it delivers uncompromising industrial-grade imaging. We're proud to announce our work with @DXOMARK, the leader in image quality testing and benchmarking, to maximize the performance of the world's first native color lidar.