Tesla faces tough questions after fatal highway crash
Tesla is facing questions about the safety of its driving system after one of its vehicles was involved in a fatal crash in California last week. It’s understood the driver who died had previously raised concerns about the Autopilot system.
The Tesla Model X struck a concrete divider on a highway in Mountain View, California, on March 23. Two other cars then collided with its rear, and the vehicle caught fire. The driver of the Tesla, Apple engineer Wei Huang, died in the crash.
In a statement published to the company’s website late on Friday night, Tesla said that its Autopilot feature was engaged at the time of the incident, but that vehicle logs showed the driver had taken no action despite receiving earlier warnings to place his hands on the wheel. Two separate safety investigations have been launched into the incident, amid reports that the victim had made several complaints to Tesla about the Autopilot system prior to the crash.
The incident is the latest in a series of hazardous collisions involving Tesla vehicles in recent years. On Friday, Tesla announced that it was voluntarily recalling 123,000 Model S vehicles due to problems with a power-steering component.
In January, a Model S collided with the rear of a fire engine that was attending a separate accident in Culver City, California. The driver reportedly told firefighters that the car had been traveling at around 65mph (105kph) with Autopilot engaged at the time of the collision. The Culver City Fire Department posted an image to Instagram showing the extent of the damage.
A Tesla plowed into the rear of Engine 42 earlier this morning while crews were on the 405 freeway for a motorcycle down. Amazingly there were no injuries! Please stay alert while driving! #ccfd #culvercityfire #culvercityfiredepartment #culvercityfirefighters #culvercity #heartofscreenland #abc7eyewitness #fox11news #ktla5news
This followed the death of Joshua Brown, who was killed driving a Tesla Model S in Florida in May 2016. The Autopilot feature came under scrutiny again after the company admitted that its software had failed to identify the white side of a tractor-trailer as it moved in front of the car. However, a subsequent report by the National Transportation Safety Board (NTSB) found that Brown had ignored warnings from the car’s driving system to keep his hands on the wheel.
According to an investigation conducted by The Guardian last year, Tesla often shares car data with the media following crashes but has never provided the logs to the drivers themselves. The paper cited a number of instances in which drivers claimed that the car had malfunctioned and caused an accident, only for Tesla to release their data to the press in an attempt to absolve itself of blame – an action some drivers claim is a breach of their privacy rights.
In one such example, a Tesla Model X driver in Montana veered off the road and hit a guardrail while driving with the Autopilot feature activated. A Tesla spokesperson told Fortune magazine that “the data suggests that the driver’s hands were not on the steering wheel, as no force was detected on the steering wheel for over two minutes after auto-steer was engaged.”
According to the California Department of Motor Vehicles, there have been 60 reported collisions involving autonomous vehicles in the state since 2014, though the majority of these incidents have been attributed to human error. There have been further isolated cases across the US, but the safety of the vehicles cannot be accurately assessed without further testing.
Tesla compared the number of fatalities involving its cars to those of other manufacturers. “[In the US] there is one automotive fatality every 86 million miles across all vehicles from all manufacturers,” the company said in a statement. “For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware.”
However, a study by the RAND Corporation points out that autonomous cars will need to be on the road far longer before their reliability can be measured. “Autonomous vehicles would have to be driven hundreds of millions of miles, and sometimes hundreds of billions of miles, to demonstrate their reliability in terms of fatalities and injuries,” the study said.