Anthony Smith

15.7K posts

Anthony Smith

@anthonyisonline

Product/ Industrial Designer | Working on Electronic Products & Health-Tech | Side-hustle buying & selling Apples, Lemonade &... Tesla 😉

Joined July 2009
2.5K Following · 1.4K Followers
Anthony Smith reposted
Miss Jilianne
Miss Jilianne@MissJilianne·
The reason Tesla’s Full Self-Driving can go thousands of miles between reported accidents is because the human supervisor intervenes when the system makes critical errors.
[image attached]
21 replies · 9 reposts · 71 likes · 1.8K views
Anthony Smith
Anthony Smith@anthonyisonline·
@FredLambert 💯 Fred, Tesla’s approach breaks every rule in the UI/UX testing handbook. They’re not testing apps here. It’s safety-critical. User complacency is a thing. No amount of asterisks from Tesla changes this. They shift the blame onto the non-expert, non-incentivized test participant.
0 replies · 0 reposts · 0 likes · 11 views
Anthony Smith reposted
Fred Lambert
Fred Lambert@FredLambert·
Tesla fans using the “4-second disengagement” as a gotcha are missing the forest for the trees. Yes, the driver was technically in control of the vehicle at the moment of impact. But she was in control because FSD was already failing by driving too fast ahead of this sharp turn — it was heading straight into a concrete barrier at highway speed with no sign of correcting. Everyone who has frequently used FSD or Autopilot and paints this 4-second disengagement as a “gotcha” moment is being disingenuous, and that includes Elon Musk. I have tens of thousands of miles on FSD, and I’ve experienced the system coming too fast into a turn at least half a dozen times.

We’ve said this before and we’ll keep saying it: the problem with FSD isn’t what happens when the driver is paying attention and the system works. The problem is what happens when the system gives you every reason to trust it, and then suddenly doesn’t work. The driver has to recognize the failure, assess the situation, decide on a correction, and physically execute it, all in less time than the system needs to create the danger.

Musk and Tesla’s propagandists can point to the logs all they want. The video shows what actually matters: FSD approaching a standard highway curve at full speed with zero indication it was going to navigate it. That’s the failure. Everything that happened after, including the panicked disengagement, is a consequence of that failure.

The framing that this was “manual driving, not FSD” is technically true for the final 4 seconds and deeply dishonest about the full sequence of events. It’s exactly the kind of liability shell game that courts are increasingly rejecting, as that $243 million verdict makes clear. Tesla created the system, sold it as “Full Self-Driving,” and profits from the ambiguity. At some point, it has to own the consequences.
Electrek.co@ElectrekCo

Tesla says FSD was off before Cybertruck crash — but the video tells a different story electrek.co/2026/03/18/tes… by @fredlambert

210 replies · 331 reposts · 3.5K likes · 239.5K views
Anthony Smith reposted
Alan Eyre
Alan Eyre@AlanEyre1·
“Although President Donald Trump says he has ‘destroyed 100% of Iran’s Military Capability’, the 0% that remains is playing havoc with the global economy.” -The Economist
166 replies · 6.9K reposts · 38.1K likes · 914.7K views
Anthony Smith
Anthony Smith@anthonyisonline·
@ICannot_Enough @28delayslater There are major UX issues in the implementation of ‘Full’ Self-Driving. Tesla are hiding behind asterisks, shifting blame to the non-expert, non-incentivized tester (a member of the public). It’s a risky experiment & goes against every UI/UX rule in the book.
0 replies · 0 reposts · 0 likes · 20 views
Earl of FrunkPuppy
Earl of FrunkPuppy@28delayslater·
FSD crashes into a barrier. Details in first comment
50 replies · 9 reposts · 78 likes · 16.6K views
Anthony Smith reposted
Crazy Fenak
Crazy Fenak@CrazyFenaker·
@aleksbrz11 So that $400M plane they gifted to Trump was literally the worst bribe/investment of all time.
11 replies · 73 reposts · 2.4K likes · 63.8K views
Adam Schwarz
Adam Schwarz@AdamJSchwarz·
Reporter: Why didn't you notify Japan that you were going to attack Iran?
Trump, next to the Japanese PM: "Who knows better about surprise than Japan? Why didn't you tell me about Pearl Harbor? You believe in surprise I think much more so than us."
191 replies · 458 reposts · 1.3K likes · 265K views
Anthony Smith reposted
Alon Mizrahi
Alon Mizrahi@alon_mizrahi·
Day 1: it's going to take a couple of days
Day 20: ok we need 200 billion dollars
263 replies · 7.8K reposts · 49.4K likes · 609.8K views
Anthony Smith
Anthony Smith@anthonyisonline·
@ChadMoran @ThisisJoeWilson Yeah it’s super clear for any potential customers buying this 😂😂😂 Full Self driving…. Errrm supervised full self driving …. A major reason why it’s not allowed in Europe. Implied product claims
0 replies · 0 reposts · 0 likes · 10 views
Anthony Smith
Anthony Smith@anthonyisonline·
Wow. Confirmation from inside one of the big incumbents that they’re way behind @Lemonade_Inc in terms of tech. $LMND
Hannes@giovannibfc

@PaperBagInvest I work for a big "legacy" insurer. Catching up will take ages, and may be impossible, even though many billions are spent. Our IT systems are from the 70s, and they don't speak to each other at all. Any development is extremely slow

0 replies · 0 reposts · 15 likes · 1.8K views
Anthony Smith
Anthony Smith@anthonyisonline·
@ChadMoran It's possible that there's a fundamental UI/UX oversight in Tesla's implementation. Alongside complacency, you've got to factor in reaction time to a disengagement & how clear this disengagement is (esp to an un-engaged overseer). Should've been tested. x.com/anthonyisonlin…
Anthony Smith@anthonyisonline

@ChadMoran Supervised FSD has always been a risky experiment. Testing safety-critical user interfaces with non-expert, non-incentivized test participants is not good practice in UI/UX design. It's one of the main reasons FSD is not allowed in Europe (alongside the name-implied claims)

0 replies · 0 reposts · 0 likes · 26 views
Chad Moran
Chad Moran@ChadMoran·
It was 4 seconds from the time of FSD disengagement until the impact. However, it was 2 seconds from disengagement until the start of the turn. Which FSD should have slowed down for. That's 170 ft to go from 60 MPH to 20 MPH. Or about 0.63g of stopping force... that's hard braking.
Chad Moran@ChadMoran

Here's the original video, which looks like it's actual speed. I used two markers and consistently measured it. It took 75 frames in a 30 FPS video to travel 212.63 ft equating to 57.9 MPH (probably 60). The incident occurred in August of 2025. These are just facts. Yes the driver should have intervened sooner. Yes, FSD was approaching way too fast.

30 replies · 4 reposts · 60 likes · 15.4K views
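Chad Moran's two back-of-the-envelope numbers above — the ~57.9 MPH speed estimate from video frames, and the ~0.63 g of braking needed to slow from 60 to 20 MPH in 170 ft — can be sanity-checked in a few lines of Python. The frame count, distance, and speed figures below are taken from his tweets; the 32.174 ft/s² value for standard gravity is the only outside constant.

```python
# Sanity-check the two calculations from Chad Moran's tweets.

G_FT_S2 = 32.174           # standard gravity in ft/s^2
FT_S_TO_MPH = 3600 / 5280  # feet-per-second -> miles-per-hour

# 1) Speed estimate: 75 frames of a 30 FPS video to cover 212.63 ft.
frames, fps, distance_ft = 75, 30.0, 212.63
elapsed_s = frames / fps                           # 2.5 s of travel
speed_mph = distance_ft / elapsed_s * FT_S_TO_MPH
print(f"Estimated speed: {speed_mph:.1f} mph")     # ~58 mph, matching the ~57.9 claim

# 2) Braking estimate: slow from 60 mph to 20 mph within 170 ft,
#    using v1^2 = v0^2 - 2*a*d solved for the deceleration a.
v0_ft_s = 60 / FT_S_TO_MPH                         # 88 ft/s
v1_ft_s = 20 / FT_S_TO_MPH                         # ~29.3 ft/s
decel_ft_s2 = (v0_ft_s**2 - v1_ft_s**2) / (2 * 170)
print(f"Required deceleration: {decel_ft_s2 / G_FT_S2:.2f} g")  # ~0.63 g
```

Both results line up with the figures in the thread, which is the point of the argument: the required deceleration is well into hard-braking territory for a turn FSD should have slowed for on its own.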
Chad Moran
Chad Moran@ChadMoran·
Here's the original video, which looks like it's actual speed. I used two markers and consistently measured it. It took 75 frames in a 30 FPS video to travel 212.63 ft equating to 57.9 MPH (probably 60). The incident occurred in August of 2025. These are just facts. Yes the driver should have intervened sooner. Yes, FSD was approaching way too fast.
[3 images attached]
Chad Moran@ChadMoran

I normally don't bother armchair quarterbacking these situations. But watching the Tesla Stans come into the comments is funny. Elon said Autopilot (FSD?) was disengaged 4s before the impact which if you watch the video looks like it was going way too fast at that point. Am I missing something?

113 replies · 23 reposts · 306 likes · 99.6K views