Wade Dorrell

22.1K posts


@waded

Quality, human factors, software, gardening, denominators, late-stage dadops. Ex MSFT, data-driven startups. Idaho lifer, Lego, LLAP, LLTBTB.

Boise, Idaho · Joined March 2008
635 Following · 1.3K Followers
Wade Dorrell reposted
U.S. Graphics Company @usgraphics
Interactive research published by Anthropic is truly outstanding. There is this resounding Tufte x Bostock aesthetic that appears everywhere and there is a high degree of density + consistency across the board. Only a product of people caring.
[image]
Anthropic @AnthropicAI

We invited Claude users to share how they use AI, what they dream it could make possible, and what they fear it might do. Nearly 81,000 people responded in one week—the largest qualitative study of its kind. Read more: anthropic.com/features/81k-i…

6 replies · 13 reposts · 218 likes · 19.4K views
Wade Dorrell @waded
@TimDOES (The autopark UI can literally only see so far, illusion of illusion of choice. Whereas that supercharger spot visualization they're cooking is like airplane seat selection, premium upsell surface area people can start using 5 miles out. Some parking lots will become that.)
0 replies · 0 reposts · 0 likes · 24 views
Wade Dorrell @waded
@TimDOES Yeah exactly. The ideal might be that we never see what spot it selects at all. Preference about drop off and pickup needs, and ticket avoidance, as opposed to supervisory last second spot A vs B stuff like current autopark.
1 reply · 0 reposts · 0 likes · 14 views
Wade Dorrell @waded
@Nightstalker89 @Teslarati Yeah of course. Imaginary car that gets more range vs imaginary car that charges faster, no particular consequence or condition either way, is how I understood the question.
0 replies · 0 reposts · 0 likes · 8 views
Nightstalker89 @Nightstalker89
@waded @Teslarati Average daily drive is less than 50 miles. So this only counts on road trips. Bigger pack = less efficient = wasted money on electricity.
Saylorville, IA 🇺🇸 · 1 reply · 0 reposts · 0 likes · 11 views
Wade Dorrell @waded
@ChadMoran @ThisisJoeWilson Right, but "autopilot" was never on the truck, and isn't the name of anything they sell today because of the December California decision. Elon says "autopilot" quite on purpose.
0 replies · 0 reposts · 1 like · 11 views
Chad Moran @ChadMoran
@waded @ThisisJoeWilson I think it's because Autopilot is a suite of features, not a specific one. Autosteer is the one people are thinking of.
1 reply · 0 reposts · 0 likes · 9 views
Joseph Wilson @ThisisJoeWilson
@ChadMoran Elon called it autopilot; we need to figure out why he said that.
2 replies · 0 reposts · 1 like · 56 views
Wade Dorrell @waded
@OverlyTrev (I had to train myself NOT to use the stalk to disengage because it doesn't do the thing that gives me time to fix it fast enough.)
0 replies · 0 reposts · 1 like · 53 views
Wade Dorrell @waded
If the driver hit the brake to disengage, yes. Given the speed continues through the first 3 seconds of disengagement, it was probably done with the button, which doesn't slow immediately, giving time to feather the go pedal. Panic ate 2 seconds, indecision about left vs. right vs. brake ate another second. 1 second left.
4 replies · 0 reposts · 25 likes · 2.8K views
Overly Trev @OverlyTrev
The driver disengaged FSD 4 seconds before impact; the rest was 100% manual driving. So the entire clip was manual driving! I did the math: the driver could have stopped in time from the first frame of the video. Here's how I reached that conclusion. Debate if you have thoughts.
1. In the very first frame, the truck was 510–560 feet from the concrete barrier. I estimated this frame by frame using the vehicle's speed on US-69/59 Eastex Freeway and local map data.
2. Speed was a steady ~60 mph. Over the ~6-second clip, distance to impact was estimated using the Cybertruck's size and how quickly it approached frame by frame. It took ~4 seconds to reach the barrier from the first frame.
3. Cybertruck braking is strong; real-world tests (MotorTrend) show it stops from 60 mph in 126 feet (~176–187 ft from 70 mph per Car and Driver). At 60–65 mph, it needs only ~130–160 feet to stop fully. With ~510 feet available when the driver disengaged, it could have stopped easily, even for a poor driver.
4. Physics did the rest: at that distance and speed, the driver simply ran out of room and panicked. Again, not FSD. All human manual driving error. The Cybertruck could have braked in time based on distance and speed.
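The stopping-distance arithmetic above can be sanity-checked in a few lines. This is a rough sketch using only figures quoted in the thread (~60 mph steady speed, ~510 ft available at disengagement, 126 ft 60-to-0 braking per MotorTrend); the 1.5-second reaction time is an assumed value, not something from the thread.

```python
# Rough sanity check of the stopping-distance claim above.
# Quoted figures: ~60 mph, ~510 ft to the barrier at disengagement,
# 126 ft measured 60-to-0 mph braking (MotorTrend).
# The reaction time is an assumption for illustration.

MPH_TO_FPS = 5280 / 3600  # 1 mph ≈ 1.467 ft/s

speed_fps = 60 * MPH_TO_FPS           # 88.0 ft/s at 60 mph
braking_ft = 126                      # 60-to-0 mph braking distance
reaction_s = 1.5                      # assumed driver reaction time
reaction_ft = speed_fps * reaction_s  # distance covered before braking begins

total_ft = reaction_ft + braking_ft
print(f"reaction: {reaction_ft:.0f} ft, total: {total_ft:.0f} ft")
# reaction: 132 ft, total: 258 ft
# Well under the ~510 ft quoted as available, even with the
# generous reaction allowance, consistent with the claim above.
```

Even doubling the assumed reaction time keeps the total comfortably inside the quoted 510 feet, which is the core of the argument in that tweet.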
Fred Lambert @FredLambert

Tesla fans using the "4-second disengagement" as a gotcha are missing the forest for the trees. Yes, the driver was technically in control of the vehicle at the moment of impact. But she was in control because FSD was already failing by driving too fast ahead of this sharp turn; it was heading straight into a concrete barrier at highway speed with no sign of correcting.

Everyone who has frequently used FSD or Autopilot and paints this 4-second disengagement as a "gotcha" moment is being disingenuous, and that includes Elon Musk. I have tens of thousands of miles on FSD, and I've experienced the system coming too fast into a turn at least half a dozen times.

We've said this before and we'll keep saying it: the problem with FSD isn't what happens when the driver is paying attention and the system works. The problem is what happens when the system gives you every reason to trust it, and then suddenly doesn't work. The driver has to recognize the failure, assess the situation, decide on a correction, and physically execute it, all in less time than the system needs to create the danger.

Musk and Tesla's propagandists can point to the logs all they want. The video shows what actually matters: FSD approaching a standard highway curve at full speed with zero indication it was going to navigate it. That's the failure. Everything that happened after, including the panicked disengagement, is a consequence of that failure.

The framing that this was "manual driving, not FSD" is technically true for the final 4 seconds and deeply dishonest about the full sequence of events. It's exactly the kind of liability shell game that courts are increasingly rejecting, as that $243 million verdict makes clear. Tesla created the system, sold it as "Full Self-Driving," and profits from the ambiguity. At some point, it has to own the consequences.

33 replies · 1 repost · 13 likes · 83.5K views
Wade Dorrell @waded
@ChadMoran
0s: disengage button (too late, mistake 0)
1s: why isn't it slowing down
2s: move foot to go pedal to moderate speed, like one does (mistake 1)
3s: see option to go straight, regen brake starts (both contradict need to move foot)
4s: half turn with foot half moved
1 reply · 0 reposts · 1 like · 16 views
Chad Moran @ChadMoran
Here's the original video, which looks like it's actual speed. I used two markers and consistently measured it. It took 75 frames in a 30 FPS video to travel 212.63 ft equating to 57.9 MPH (probably 60). The incident occurred in August of 2025. These are just facts. Yes the driver should have intervened sooner. Yes, FSD was approaching way too fast.
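The frame-count speed estimate above works out as follows; a minimal sketch, assuming only the figures quoted in the tweet (75 frames of a 30 FPS video to cover 212.63 ft between the two markers).

```python
# Reproducing the frame-count speed estimate from the tweet above.
frames, fps = 75, 30
distance_ft = 212.63

elapsed_s = frames / fps              # 2.5 seconds of video
speed_fps = distance_ft / elapsed_s   # ~85.05 ft/s
speed_mph = speed_fps * 3600 / 5280   # convert ft/s to mph

print(f"{speed_mph:.1f} mph")
# 58.0 mph, matching the ~57.9 mph (probably 60) quoted in the tweet
```

The small gap between 57.9 and 58.0 is just rounding in the intermediate steps; either way it supports the "~60 mph" reading.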
[3 images]
Chad Moran @ChadMoran

I normally don't bother armchair quarterbacking these situations. But watching the Tesla Stans come into the comments is funny. Elon said Autopilot (FSD?) was disengaged 4s before the impact, which, if you watch the video, looks like it was going way too fast at that point. Am I missing something?

113 replies · 22 reposts · 305 likes · 99.5K views
bitfloorsghost @bitfloorsghost
we ruined such a good thing
[image]
718 replies · 6.1K reposts · 112.2K likes · 8.1M views
Wade Dorrell @waded
Amazing what AI can help us build. But remember that not everything has to be built. Demand in balance with supply, build in balance with maintain. Unbuild in balance with build? (skynet, you didn't hear it from me)
0 replies · 0 reposts · 0 likes · 25 views
Wade Dorrell @waded
@Ryan_Turner_01 @nypost FWIW by some accounts this video predates the metadata/overlay. The video/incident isn't necessarily recent any more than it's the whole story on any other dimension.
1 reply · 0 reposts · 0 likes · 14 views
MarzBus𝕏 @Ryan_Turner_01
@nypost It’s strange that the video had to be cropped, which now prevents us from determining whether the car is in Full Self-Driving mode or not.
1 reply · 0 reposts · 1 like · 119 views
Wade Dorrell @waded
@salmun_nister @teslascope > even PIN to drive was disabled Ah, finally an obvious reason for "Security Improvements" in release notes! 😬 (My profile seemed intact on 2026.8, but I wasn't the first driver of the car after it updated.)
[image]
0 replies · 0 reposts · 1 like · 15 views
~salmun-nister @salmun_nister
@teslascope Hopefully it's a fix for 2026.8 that is causing the tutorial videos to show up every time I get in the car. The update had a few issues: the car's name was deleted, some profile settings messed up temporarily (switching profiles fixed), and even PIN to drive was disabled.
1 reply · 0 reposts · 0 likes · 109 views
Wade Dorrell @waded
Is hotdog fingers basically just fingers that happen to be near a hotdog?
0 replies · 0 reposts · 0 likes · 5 views
Wade Dorrell @waded
Never really thought about Jordan 2345. Supposedly he picked 23 as an approximation of 45 * 0.5. Backed into the sequence. He should've grabbed 1 also.
0 replies · 0 reposts · 0 likes · 7 views