Tesla's Updated Autopilot Still Not Safe Enough, Consumer Reports Says

Tesla was forced to recall nearly two million cars in the U.S. in December after the National Highway Traffic Safety Administration determined that its Autopilot driver assistance system could be abused too easily. Right out of the gate, experts were skeptical that Tesla would actually fix the problems the NHTSA had with Autopilot, and when Consumer Reports got its hands on the updated version, wouldn’t you know it, the system could still be abused far too easily.

People Are Relying on ADAS to Do Things It Can’t Do

New warnings and driver notifications may be more visible than before, but because they appear on the center screen, Consumer Reports found that they actually pull drivers’ attention away from the road. The strikes policy Tesla introduced for its so-called Full Self-Driving beta now applies to Autopilot use as well, but Consumer Reports found it isn’t reliable enough to encourage safer driving. On top of that, activating adaptive cruise control also activates lane centering, which wouldn’t be a problem except that if a driver has to turn the wheel to, say, avoid a pothole, adaptive cruise control shuts off along with it.

The most concerning issue, however, is the ineffective driver monitoring system that Tesla still uses:

Autopilot still lacks an effective direct driver-monitoring system (DDMS), which we believe is essential to the safe operation of ADA systems. Although the in-cabin camera is capable of determining if the driver is looking away, the camera can be covered entirely and Autopilot will still work on all road types without warning the driver that the camera is blocked.

Additionally, when we covered the camera and kept one hand resting on the steering wheel, the vehicle did not limit Autopilot use or give any warnings to pay attention. According to [Kelly Funkhouser, associate director of vehicle technology at CR’s auto test center], the driver could be asleep or completely distracted and the car wouldn’t warn them as long as they are holding the wheel.

In a statement, William Wallace, CR’s associate director of safety policy, called on the NHTSA to require Tesla to make more significant changes to its Autopilot software that actually improve safety. “In light of Consumer Reports’ findings, NHTSA should immediately revisit this recall and require Tesla to take stronger steps to protect people’s safety,” Wallace said. “NHTSA shouldn’t wait for any more crashes. It should take action now to make this recall better.”