There are some spectacular failures that are very scary because the car does something that a person would never do, unless unconscious, like this one: https://www.youtube.com/watch?v=LfmAG4dk-rU
You don't hear much about it, probably because Tesla fanboys are plentiful and rabid, so people avoid talking about it online.
The defence is usually statistics showing that human drivers crash more often, and it sounds convincing until you dig into the numbers, because these comparisons are usually apples vs. oranges. It feels like they have a playbook of statistics to slap down whenever someone says something negative. If that doesn't work, they say the victim should have followed the manual, which says "your attention should always be on the road", and then go on to post a video about how, thanks to the latest update, they can sleep-drive to work and attach a banana to the steering wheel to defeat the attention safeguards.
If someone asks how this is an autopilot, there are usually two ways to handle it:
1) Autopilot is just a brand name; the self-driving software is in beta, so the victim should have been paying attention at all times.
2) Autopilot is like an aircraft autopilot, so only fools think it is autonomous. It was therefore working as intended, and they should have been using it like an airline autopilot. Crash due to user error.
I'm actually a fan of Musk and Tesla, but I feel the community engagement is very unhealthy and lacks scrutiny because of his "online army".
I'm surprised you don't just link to the source that regularly updates itself on all Tesla deaths and makes a point of citing its sources. Note this is not just for Autopilot, but all deaths occurring from Tesla vehicles (including people not in the vehicles). There are tags for various things like "Autopilot" and "Pedestrian/cyclist".
I'm optimistic for this area of tech and research in general, but agree we need to stop benchmarking against average human crash rates.
Although anyone can be hit at any time, the distribution of human crashes is not purely random. People who drive impaired, for example, are heavily overrepresented in those events.
So, theoretically, the tech could reach a crash rate lower than the human average overall, yet still increase your personal crash likelihood.
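The argument can be made concrete with a toy mixture calculation. All numbers below are invented for illustration; the point is only that a fleet average can be dragged up by a small, high-risk group, so "better than the fleet average" does not imply "better than an attentive driver":

```python
# Toy numbers (all assumed, not real crash statistics):
impaired_share = 0.05   # fraction of miles driven by the high-risk group
impaired_rate = 40.0    # crashes per million miles, high-risk group
attentive_rate = 2.0    # crashes per million miles, attentive drivers

# Fleet average is a weighted mix of the two groups.
fleet_average = impaired_share * impaired_rate + (1 - impaired_share) * attentive_rate
print(f"fleet average:        {fleet_average:.1f} crashes per million miles")
print(f"attentive-driver rate: {attentive_rate:.1f} crashes per million miles")

# A system that merely beats the fleet average (say 3.0) still
# raises the risk for the attentive group.
system_rate = 3.0
print(system_rate < fleet_average)   # beats the average...
print(system_rate < attentive_rate)  # ...but not the attentive driver
```

With these assumed numbers the fleet average works out to 3.9, so a system crashing at 3.0 per million miles looks like an improvement in aggregate while roughly doubling the attentive driver's risk.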
If you believe you are a better than average driver when manually driving, why couldn't you also be better than average at intervening when autopilot is driving?
Humans don't have forward facing radar nor 360 degree always active vision, so in some cases, cars can see hazards that even a perfect human driver cannot.
It's a natural human tendency to get distracted, especially when we are not engaged with a task (as with autopilot). Driving manually physically engages your body in the task, making it harder to get distracted. Intervening, on the other hand, is subject to distraction and longer response times.
Yes, that's captured by the defences I listed above. You call it an autopilot, you make customers purchase a self-driving capability package, you say the car comes with all the hardware necessary for self-driving, your fans post videos online of cars driving themselves, and then in the small print you say that it's not autopilot but Autopilot, that drivers must pay attention at all times, and you add very weak safeguards to enforce that attention.
It's simple plausible deniability for Tesla and the fanboys. Good enough to keep them off the hook, legally.
That's such a stretch and a lot of word-mincing.
To this day in 2021, many people still don't wear seat belts. Even though that's a solved problem. Some people will do stupid things. It's human nature.
Sure, but car makers don't imply or give the impression that you don't need seatbelts thanks to the collision-detection system or the airbags. They don't sell a seatbelt-free system whose small print says the seatbelt must be worn at all times except for off-road driving.
You're being a bit pedantic here. Every time you engage autopilot, it literally tells you (in bold letters) to "Always Keep Your Hands on the Wheel" and to "Be Prepared to Take Over at Any Time". Keep abusing it and the car will actually disable it for the rest of the drive. Did you know that?
There's no "impression" being given. Like I mentioned, there will always be a small fraction of irresponsible humans who will do dumb things. No amount of engineering can fix that.
Case in point is this video: https://youtu.be/VS5zQKXHdpM?t=88 Her mom is even helping this kid film this stupid act just for clout and views.
Tesla owners grilled him and said what he was doing was dangerous and irresponsible. He then deleted all the comments and turned off commenting. But people like to demonize Tesla owners, which I find bizarre.
I'm not so sure the second one is something a person would never do. The second car also brakes very late, for instance. Did autopilot brake at the last second for the Tesla, or did the driver? I'm also bearish on autopilot and am not saying the Tesla performed well, but I bet human highway drivers run into stationary traffic all the time too.
One has to distinguish between "autopilot" and "FSD". Autopilot is used for driving on highways and relies mostly on radar for obstacle avoidance. It handles traffic well, that is, cars moving around, but not static obstacles. The problem is the radar's low spatial resolution: non-moving obstacles are difficult to distinguish from background reflections.
"FSD" builds a 3D model of the environment from the camera images, but so far it is only active off highways. It should be much better at avoiding static obstacles. In any case, they are different systems, so experience with one cannot always be transferred to the other.
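The stationary-obstacle problem can be sketched in a few lines. This is a hedged toy model, not Tesla's actual pipeline: ACC-style radars measure relative (closing) velocity, and a return whose closing speed equals your own speed looks exactly like static background clutter (overpasses, signs, guard rails), so a common heuristic is to discard such returns, and a stopped car in your lane gets discarded with them:

```python
# Toy sketch of Doppler-based clutter rejection (assumed heuristic,
# not any vendor's real implementation).
ego_speed = 30.0  # m/s, our own speed

# Simulated radar returns: closing_speed is how fast the object
# approaches us. A stationary object closes at exactly ego_speed.
returns = [
    {"id": "car ahead",   "closing_speed": 5.0},   # moving target
    {"id": "overpass",    "closing_speed": 30.0},  # harmless stationary clutter
    {"id": "stopped car", "closing_speed": 30.0},  # stationary hazard!
]

# Reject anything that looks stationary relative to the ground.
tracked = [r for r in returns
           if abs(r["closing_speed"] - ego_speed) > 1.0]

# The stopped car is indistinguishable from the overpass by Doppler
# alone, so it gets filtered out along with the clutter.
print([r["id"] for r in tracked])
```

Only the moving car survives the filter. Distinguishing the overpass from the stopped car would require spatial resolution (or camera fusion) that the Doppler measurement alone does not provide, which is the low-resolution problem described above.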
Yes - 'autopilot' is auto-steer combined with adaptive cruise control. Teslas technically do have 'dumb' cruise control that doesn't reduce speed when a car gets close, but that configuration is rare, since you need to specifically call in to buy one without autopilot, and it is probably not what the article was referring to.
I think the parent comment was making the same point as you. We, but mostly Tesla, should be educating people on why autopilot doesn't mean the car drives itself in all situations. When buying a car[0], the furthest the webpage goes is:
> automatic driving from highway on-ramp to off-ramp including interchanges and overtaking slower cars
And while it does all of this very well right now, there still is that piece of text at the bottom:
> The currently enabled features require active driver supervision and do not make the vehicle autonomous.
Honestly, Tesla is doing the bare minimum here to maintain plausible deniability; the text at the bottom really should be the same size as the text above it.