This thread will probably end well, but this has been my beef with Autopilot all along. The public is beta testing the cars, and they're now projectile missiles when Autopilot fails and/or the driver (who's supposed to be in control, but thinks it's cool the car can drive itself and sits back and relaxes instead) isn't in command.
Do I fault the driver who wasn't in control and should've been? Absolutely. Do I also fault Tesla for using the public to beta test these systems, and for shipping a fundamentally weak system for detecting whether the driver is paying attention? You betcha.
Two Model S Plaids caught fire; the first belonged to a billionaire Tesla investor, and the other owner tweeted "nbd Tesla has my car now".
If we joke that it's a death cult, it's because the only thing that convinces many fans is decapitation by a box truck. The guy in SF who died had said that Autopilot almost crashed multiple times on that particular merge.
Genuinely curious how all these accidents are possible when the Tesla makes you move the wheel every 15 seconds or so to make sure you're paying attention. Is there an easy workaround, or are people just gaming it somehow?
There's a video of it too, where you can see the Ford cutting the Tesla off while the Tesla doesn't react. I blame both, but having your kid not wear a seatbelt is irresponsible. Good luck with the lawsuit.
It doesn't sound like he was relying solely on Autopilot; it sounds like he was cut off on a highway and reacted at the last second, same as the car.
Also, all new luxury cars automatically brake in these situations; that has nothing to do with some fancy autopilot. Do Teslas not do this? Seems like the "Autopilot" angle is just something the media gins up for clicks.