US regulator opens investigation into Tesla’s Autopilot system after crashes into emergency vehicles
The US auto safety regulator has launched a major inquiry involving some 765,000 Tesla cars over the ‘Autopilot’ driver assistance system after a series of crashes into emergency vehicles.
The formal inquiry will “assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” the National Highway Traffic Safety Administration (NHTSA) announced on Monday.
If the investigation finds the system to be flawed, it could result in the recall of Tesla’s vehicles or other penalties from the agency.
The regulator said that since 2018 it had identified 11 traffic incidents – four of them this year – in which various Tesla models “encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.” One person was killed and 17 others injured in these incidents, it added.
The electric cars involved in those collisions were “all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control” at that time, the NHTSA pointed out.
Most of the incidents occurred at night, but the crash scenes were marked with “flares, an illuminated arrow board, and road cones,” and the affected emergency vehicles had their first responder lights switched on, the agency added.
The inquiry covers Tesla cars released in the US since the beginning of 2014 – some 765,000 vehicles across the company’s Y, X, S and 3 models.
Tesla Inc. hasn’t yet commented on the investigation. The firm’s Autopilot feature allows drivers to take their hands off the wheel for extended periods of time while the vehicle is still in motion. However, the manufacturer insists that the driver must be ready to intervene and take manual control of the car at any time.
A number of drivers, however, have misused the system, with some caught driving drunk, asleep, or even sitting in the back seat of a moving car, as one did in California’s Bay Area earlier this year. At least three Tesla vehicles on Autopilot have been involved in fatal crashes since 2016, the NHTSA said.
The regulator, which previously investigated separate crashes involving Tesla cars, has already criticized the company over the lack of safeguards for its Autopilot. It advised Tesla to limit the use of the system to certain areas and install better mechanisms to make sure the person behind the wheel was paying attention to the road.