Autopilot? NHTSA To Investigate Fatal Tesla Crash After It Ran A Red Light

The fatal Dec. 29 crash of a Tesla Inc. vehicle in Southern California will be investigated by the U.S. government's auto safety regulator, the agency said on Tuesday.

NHTSA said earlier this month it had opened an investigation into a 12th Tesla crash that may be tied to the vehicle’s advanced Autopilot driver assistance system after a Tesla Model 3 rear-ended a parked police car in Connecticut.

NHTSA did not say if Autopilot was suspected in Sunday's crash in Gardena in Los Angeles County.


MDarringer - 1/2/2020 10:24:57 AM
-3 Boost
We really need to get Teslas in the hands of terrorists.


MDarringer - 1/2/2020 11:19:36 AM
+3 Boost
agree


SanJoseDriver - 1/2/2020 2:23:13 PM
-1 Boost
In every case it was found the driver was not using the system correctly. Statistically you're still safer using Autopilot, especially if you're paying attention.


SanJoseDriver - 1/3/2020 4:23:57 PM
0 Boost
It doesn't allow a driver to fall asleep: every 30 seconds you have to tug on the wheel unless you use an illegal device to override that. Within 60 seconds of not detecting hands, the car stops and puts its hazards on. Some of the videos were staged and others were people using cheat devices.


SanJoseDriver - 1/2/2020 2:20:41 PM
0 Boost
A red light was run before the accident. Autopilot doesn't stop for lights, as it's not yet intended for use on city streets. Likely it will be found no-fault, just like all of the other investigations.


PUGPROUD - 1/2/2020 6:11:58 PM
+2 Boost
Blaming the victim is a winning marketing strategy!


SanJoseDriver - 1/2/2020 11:08:48 PM
+1 Boost
Here is a fair comparison of Autopilot versus BMW's best efforts on their most expensive car:
https://www.youtube.com/watch?time_continue=444&v=ceziEp4oEFA&feature=emb_logo


SanJoseDriver - 1/3/2020 4:25:13 PM
+1 Boost
Nah, the drivers are still responsible. Only Waymo is offering a fully autonomous system right now. Tesla still has a ways to go but it's obvious they are way ahead of the curve for cars you can buy.


valhallakey - 1/2/2020 8:17:28 PM
+2 Boost
The driver is fully and ultimately responsible and must pay attention. All of these types of accidents are the result of driver error. If you have your car on cruise control and the highway ends, is it the cruise control system's fault? Of course not... people need to quit blaming others (yes, even Tesla) for what can only be described as driver error. I would react differently if Autopilot made it impossible for the driver to have any input or to correct errors it was making. This is very much like an airplane: if it is on autopilot and behaves incorrectly, it is up to the pilot to be paying attention and correct any errors.


SanJoseDriver - 1/4/2020 10:16:57 PM
+1 Boost
Which is the truth, and lives are being saved by this existing at all, even though it is far from perfect.


SanJoseDriver - 1/5/2020 7:31:47 PM
0 Boost
For every horror story, there are many more stories like these where Autopilot actually prevented an accident, even when others on the road would be at fault:

https://insideevs.com/news/391075/video-autopilot-prevents-tesla-crash/


MDarringer - 1/5/2020 7:52:29 PM
+1 Boost
That's like saying for all the heroin-shooting whores with STDs there are monogamous women who are not guilty of moral gaffes. The latter does not erase the former. #LogicalFallacy


SanJoseDriver - 1/7/2020 3:59:06 AM
+1 Boost
The most logical argument is statistics: if you're less likely to die in a car with Autopilot, you're effectively killing people if you were to ban it.


Copyright 2026 AutoSpies.com, LLC