19 Apr, 2021 15:36

Two killed in ‘driverless’ Tesla, possibly using ‘Autopilot’ feature, as Elon Musk touts company's safety record

In another potential ‘Autopilot’-related accident, two people were killed in a fiery crash in Texas with no one in the driver’s seat of the Tesla vehicle – on the same day the carmaker touted its vehicle safety record.

The fully electric 2019 Tesla Model S sped off the road after failing to make a turn on Saturday, crashed into a tree north of Houston, and burst into flames with its occupants trapped inside.

Local news station KPRC-2 reported that a preliminary police investigation suggested that “no one was driving” at the time of impact. One man was sitting in the front passenger seat and the other was in the back.


First responders told the station that it took four hours and 32,000 gallons of water to extinguish the flames because the vehicle’s batteries kept reigniting. Authorities even had to call Tesla at one point because firefighters didn’t know how to keep the fire from flaring up again. The Model S has been involved in a series of fires in recent years.

As the investigation is ongoing, it is still unclear whether the car’s autopilot mode was activated. A police source told the New York Times that the wives of the victims had heard them talking about the feature just prior to setting out for the ride.

There has been no official statement as yet from Tesla, which disbanded its press relations team and typically does not respond to media inquiries.

Earlier in the day, however, Elon Musk, Tesla’s chief executive, had shared the company’s first quarterly vehicle safety report for 2021, tweeting, “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle.”

In the wake of the accident, the tweet has attracted a mix of support and criticism, with a number of users blaming the accident on human error. Others argued that Tesla was to blame for “misleading the public” by advertising “hands-free driving”.

Although Musk has already outlined a plan for “fully self-driving” cars to be made available this year, there are roadblocks to negotiate. Last month, the US National Highway Traffic Safety Administration opened probes into some 27 crashes involving Tesla vehicles.

Tesla has also been under fire for marketing its automated driving systems under the ‘Autopilot’ name, which critics say is a misleading term that encourages dangerous behavior such as sleeping at the wheel. Videos of this and other reckless behavior have circulated online.

Germany has already banned Tesla from using such terms as “autopilot inclusive” and “full potential for autonomous driving” in advertising materials.

The carmaker, whose website describes the autopilot mode as the “future of driving”, states that the feature allows vehicles to “steer, accelerate and brake automatically within its lane.” However, it also cautions that “current Autopilot features require active driver supervision and do not make the vehicle autonomous.”

In the past, Musk has vetoed calls from his own engineers to add safety monitoring measures, such as cameras and additional sensors, for when a vehicle is in Autopilot mode, dismissing them as “ineffective.”

During a May 2018 call with investors, he said, “When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency.”

“They just get too used to it. That tends to be more of an issue. It’s not a lack of understanding of what Autopilot can do. It’s [drivers] thinking they know more about Autopilot than they do,” he added.
