Teslascope

29.7K posts

Teslascope
@teslascope
A complete view of everything about your Tesla vehicle. We are the worldwide drivers' platform and highly rated vehicle companion. 🚘 Not affiliated with Tesla.

Worldwide · Joined April 2019
144 Following · 72.3K Followers
Pinned Tweet
Teslascope
Teslascope@teslascope·
Our account, @vehiclescope, was finally unsuspended two days ago (confirmed by X Support), and is now suspended AGAIN for the same reason. Please look into how your automated systems function, @nikitabier @elonmusk. This is negatively impacting legitimate businesses on X.
Teslascope@teslascope

This morning, we were contacted by X to confirm that we are the authorized representative of @vehiclescope (+ other sensitive documents). This update is encouraging, and we'd like to thank everyone for their continued support. For transparency, here is the letter we sent to @X.

Teslascope
Teslascope@teslascope·
@LikeToasters @LinkN01 @gatunoteproton This may be best answered by one of our other posts! x.com/teslascope/sta… tl;dr: it looks like a sign of pressing the accelerator to force FSD to bypass its speed profiles.
Teslascope@teslascope

We review cases like this all the time, and our entire career is spent managing and studying data from these vehicles through our platform (either with vehicle owners' consent or in de-identified form). Multiple test drives have been taken within the past 24 hours at this location on both the latest and older FSD lane assist stacks (including ones older than the stack installed in this incident). In both cases, the vehicle traveled at a slower speed (35-49mph) than in the incident (54-68mph at the 4-second prior mark). This implies that the driver was pressing down on the accelerator, forcing the vehicle to accelerate faster than it would normally. This action does not deactivate lane assist/FSD, but the system treats it as a manual override (as it is).

Given the speed of travel, the turn ahead, and the posted speed signs (and that the vehicle was traveling at 3-4x the posted speed), this all supports the narrative that the driver was not paying attention to the road and was potentially driving recklessly. If the vehicle was operating normally and at its own suggested speed, it would likely have handled the corner turn with ease, as has been tested on the same vehicle as well as other models. All speculative, all an opinion of course.
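The comparison described above reduces to a trivial check. A toy sketch: the figures come from the post, but the one-line threshold rule is our illustrative assumption, not Teslascope's actual methodology.

```javascript
// Illustrative only: compare the incident speed against baseline test
// drives of the same route under FSD. Figures are from the post above;
// the threshold rule is an assumption for illustration.
function exceedsAutonomousBaseline(incidentLowMph, baselineRunsMph) {
  // If even the lowest incident reading beats the fastest speed FSD chose
  // on its own across every test drive, some other input (e.g. the driver
  // pressing the accelerator) was likely raising the speed.
  return incidentLowMph > Math.max(...baselineRunsMph);
}

// Test drives ran 35-49 mph; the incident vehicle was at 54-68 mph at the
// 4-second-prior mark.
exceedsAutonomousBaseline(54, [35, 49]); // true with these figures
```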

Teslascope
Teslascope@teslascope·
Tesla reported that Autopilot (FSD) was disengaged four seconds before impact, which is the entire duration of the top video, so FSD was not driving at that point. FSD can also be forced to drive faster than it recommends if you press on the accelerator; doing so will not disengage it unless an unavoidable collision is detected as a result of your manual action.
Jake Karll
Jake Karll@JakeKarll·
100%. Between your telemetry data, countless unsuccessful attempts to reproduce on v12.6 and v14, personal experiences of FSD just not behaving in that way, and Elon himself stating it was not engaged, there is a clear correlation here. Is there any counter evidence? I’m unaware of any.
Teslascope
Teslascope@teslascope·
@Untangling_X Yes, increase in reports on all recent builds unfortunately.
@Untangling_X·
@teslascope Car reverted all settings (PIN to Drive, glovebox, overheat protection, Sentry Mode) after 2026.8. Are there reports of this happening? (2019 Model 3 HW3)
Gravity Analytica Capital
Gravity Analytica Capital@GravityAnalyti1·
Looks to me like the driver realized the truck was going way too fast for the curve and tried to take control from the lane assist system, but it was too late to avoid the accident. Four seconds before the crash the truck was traveling far too fast for that corner. This guy has 7 name changes as well.
Overly Trev@OverlyTrev

DEBUNKED FUD for Cybertruck on FSD crash. Elon confirms this was human error. The log shows FSD was disabled 4 seconds before the crash, so this entire video is literally ALL HUMAN DRIVING! I knew it from the first time I saw the video—this was 100% human error, manual driving.

Teslascope
Teslascope@teslascope·
The final sentence in our post is critical. While we frequently review vehicle telemetry from Tesla vehicles like this and can make assumptions based on patterns we recognize, our post is still just an educated guess. Data is king. Unless this vehicle was connected to a third-party app (and the owner shares such recorded telemetry of their own accord), there's nothing better than data. The vehicle itself recorded much more than the Autopilot disengagement data, but unless they are subpoenaed (or volunteer it themselves as a defense), I'm unsure we'll ever see this data. As with all cases such as this, time will tell.
Tesla App Updates (iOS)
Tesla App Updates (iOS)@Tesla_App_iOS·
Took the time this morning to do something that will save me a lot of time in the future. For some reason X does not give you a way to mass-copy your current subscribers' usernames (which we use for the article shout-outs). So we wrote a Firefox extension to save us time: just browse to the subscriber page and it will fetch all the subscribers for you, and it also checks whether they are verified. @Aureliius pointed out that the OCR method we were using before kept getting his username wrong, which led us to develop this solution! Anyways, hope you all are having a wonderful week, appreciate you all!
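The core of such an extension is walking the subscriber cells and pulling out the handle plus verified badge. A hedged sketch with the DOM access factored out; the cell shape and the selector named in the comment are assumptions, since X's real markup differs and changes often.

```javascript
// Hypothetical sketch of the extension's core step. In a real content
// script the cells might come from something like
// document.querySelectorAll('[data-testid="UserCell"]') on the subscriber
// page; here the DOM is reduced to plain objects so the logic is testable.
function collectSubscribers(cells) {
  return cells
    .filter((cell) => typeof cell.profileHref === "string" &&
                      cell.profileHref.startsWith("/"))
    .map((cell) => ({
      username: cell.profileHref.slice(1),   // "/alice" -> "alice"
      verified: Boolean(cell.verifiedBadge), // badge element present?
    }));
}
```

Copying the result out for the shout-out list would then be one `navigator.clipboard.writeText(...)` call away.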
Chad Moran
Chad Moran@ChadMoran·
@teslascope @GravityAnalyti1 That's interesting, because even on FSD 14 my Model Y tried to take an impossible turn because it thought it was on the other side of a barrier. I'm excited to continue this discussion when I finish gathering information. :)
Teslascope
Teslascope@teslascope·
There's been a long-standing function related to construction (going back as far as V12.3) where, when the vehicle detects a road narrowing or a change that differs from the lane availability in mapping data, it takes a more conservative approach until conditions settle. This is what has allowed FSD, even on HW3 vehicles, to navigate construction zones with reduced lanes with relative ease, because it can adapt the same way a human would. So in all videos of this incident, including the longer one not quoted above, this would have provided further context to the pathing NNs to reduce speed or be cautious. There are so many aspects of FSD that would all have had to fail in a spectacular manner that it is much more probable to attribute manual action (such as pressing on the accelerator) as the cause of failure.
Teslascope
Teslascope@teslascope·
Correct. What you see visually on the map and what the vehicle is planning for via ego are often the same, but there are many, many occasions where they will not be. Even if the CT thought it was still on the highway, it would have felt the incline, seen the shoulders/road edges, and would have reduced speed as it went up. If placement failed as you describe more often, we would have seen Teslas crashing at full speed at intersections or off-ramps far more frequently on V13. That's why there have been so many more complaints with V14: they reworked the NN responsible for "where I need to be on the road to get from A to B," causing navigation routing differences on typical and repeated routes.
Teslascope
Teslascope@teslascope·
The braking in tunnels in Seattle (which we experienced many a time when living in Seattle years ago on FSD ~V10-11) should never be due to mis-negotiating a turn/curve, even with all the merges on the 5. When was the last time it slammed on the brakes, and what were the road conditions?
Chad Moran
Chad Moran@ChadMoran·
> GPS shifts or bad mapping data is possible but this 1) only impacts navigation routing visually for the customer 2) would not impact decision making of ego. 3) When a vehicle misses a turn on FSD (V13+), it still continues driving but then does its best to figure out the next logical step. Factually incorrect. I've had this happen where my Model Y would SLAM on the brakes in a tunnel under Seattle. Also, look at this video. How do you explain this behavior then? x.com/AndyZeGerman/s…
Teslascope
Teslascope@teslascope·
@CurlyRunnerEric @GravityAnalyti1 Correct. This is twofold, as it also prevents the vehicle from applying non-Emergency brakes, so AEB would only have triggered at the last second, ~1-2 seconds prior. Ego reaches a point where it can determine that the manual acceleration can no longer result in a safe maneuver.
Eric
Eric@CurlyRunnerEric·
@teslascope @GravityAnalyti1 Whaam Baam has shown stories before where holding down the accelerator around a curve can disengage FSD, and it's counted as a manual disengagement.
Teslascope
Teslascope@teslascope·
V14, and to a lesser degree V13, both have conditional parameters that influence speed profiles, much to the love and woe of owners. Fewer cars and no debris/potholes recently reported by other vehicles via TN will typically elicit a ~5-10% higher max speed (on Hurry or Mad Max), combined with many other factors.

But the smoking gun is the mishandling of the turn, combined with knowing that FSD was deactivated four seconds prior. Speed profiles still must comply with ego's pathing, which has handled curves and turns effortlessly in V13 and V14; if a turn is approaching, the vehicle will adjust the speed profile, otherwise we'd have cars crashing every minute of every day when trying to make a right turn at an intersection. The speed not decreasing shows that some other factor was in play preventing the vehicle from slowing down appropriately for the upcoming turn/curve. In most reports we work on with our customers (for both accidents and speeding tickets), applying pressure to the accelerator is common, which prevents non-Emergency braking and can make FSD catastrophically fail to navigate a turn.

Disengaging FSD at the four-second mark also shows that the driver had:
- Ignored the speed limit sign.
- Been unbothered by the speed the vehicle was going as it entered the incline.
- Not been aware, or not been paying attention, that a turn was coming up.

FSD can make mistakes, and does all the time; but when there are this many signs of potential misuse, we just want to put this all on the table for analysis's sake.
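The conditional speed-profile idea above can be sketched as a simple multiplier. Purely illustrative: the factor names and the ~10% figure come from the post, but the combination logic is our assumption, not Tesla's implementation.

```javascript
// Toy model of the described behavior: favorable conditions reported via
// the fleet (fewer cars, no debris/potholes) nudge the max speed up by
// up to ~10% on the aggressive profiles. Not Tesla's actual logic.
function maxProfileSpeedMph(baseMph, { profile, lightTraffic, debrisReported }) {
  const aggressive = profile === "Hurry" || profile === "Mad Max";
  const bump = aggressive && lightTraffic && !debrisReported ? 0.10 : 0.0;
  return Math.round(baseMph * (1 + bump)); // round to whole mph
}
```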
The Cyber Stud
The Cyber Stud@TheCTStud·
Not trying to defend the idea that FSD is responsible for this crash (it isn't), but this statement below is very misleading: "In both cases, the vehicle traveled at a slower speed (35-49mph) than in the incident (54-68mph at the 4-second prior mark). This implies that the driver was pressing down on the accelerator, forcing the vehicle to accelerate faster than it would normally. This action does not deactivate lane assist/FSD, but the system treats it as a manual override (as it is)." No, it does not imply that. FSD drives at different speeds for me on the same roads even in the same speed profile! It's really random, like there's some ghost variable impacting speed decisions.
Teslascope
Teslascope@teslascope·
Whether V12, V13, or V14 (in this instance, the vehicle was running V13, presumably V13.2.9), ego has consistently predicted and adjusted for curves in planning, especially for an off ramp/interchange. This has been evident in all test drives of this route since the incident. GPS shifts or bad mapping data is possible but this 1) only impacts navigation routing visually for the customer 2) would not impact decision making of ego. 3) When a vehicle misses a turn on FSD (V13+), it still continues driving but then does its best to figure out the next logical step. On V13/V14 this should practically never result in an "oh sh*t" scenario where the car magically crashes into a parked car. Its "where to go" mind is separate from its ability to make sense of the road ahead.