Las Vegas Sun

June 17, 2024

Undeterred, Tesla defends its self-driving technology

Marcio Jose Sanchez / AP

Elon Musk, CEO of Tesla Motors, introduces the Model X car at the company’s headquarters Tuesday, Sept. 29, 2015, in Fremont, Calif.

Even as federal safety officials step up their investigation of a fatal crash involving a Tesla being driven with its Autopilot system engaged, the company continues to defend the self-driving technology as safe when properly used.

The National Highway Traffic Safety Administration on Tuesday released a detailed set of questions for the carmaker about its automated driving system, particularly the emergency braking function.

The Autopilot system has been the subject of a federal investigation since regulators revealed in late June that the driver of a Tesla Model S sedan, Joshua Brown, was killed on May 7 when his vehicle crashed into a tractor-trailer in Florida.

The Autopilot feature was engaged at the time, Tesla has said, but neither the automatic braking system nor Brown applied the brakes before the car hit the trailer at 65 mph.

Despite that acknowledgment, and even as the federal agency pushes for answers about the accident and whether the Autopilot system failed to work properly, Tesla officials continue to say that the technology is safe. They also say they have no plans to disable the feature, which is installed in about 70,000 Tesla cars on the road. Instead, they suggest that drivers may be to blame for misusing Autopilot.

In an interview, a Tesla executive said the company believed that the system was safe as designed, but that consumers needed to realize that misusing Autopilot “could mean the difference between life and death.”

The executive, whom Tesla authorized to speak only if he was not named, said drivers needed to be aware of road conditions and be able to take control of the car at a moment’s notice — even though he said Autopilot’s self-steering and speed controls could operate for up to three minutes without any driver involvement.

“With any driver assistance system, a lack of customer education is a very real risk,” the executive said.

Tesla, whose massive Northern Nevada battery factory will churn out batteries for the company's electric cars, has been widely criticized for introducing Autopilot in “beta” mode, which usually signifies a technology that is still under development and not completely tested. And some engineers researching self-driving cars have concluded that automated systems that rely on the driver’s suddenly resuming command cannot be made fully safe.

But the Tesla executive said the Autopilot system had performed safely during tens of millions of miles of driving by consumers.

“It’s not like we are starting to test this using our customers as guinea pigs,” he said.

Elon Musk, Tesla’s chief executive, said in a Wall Street Journal interview published Tuesday that the company planned a blog post to educate Tesla owners on how to use the system safely.

“A lot of people don’t understand what it is and how you turn it on,” he said.

The questions raised by NHTSA, in a nine-page letter that was dated July 8 but not made public until Tuesday, indicated the agency was investigating whether there were defects in the various crash-prevention systems related to Autopilot.

Those systems include automatic emergency braking, which is supposed to stop Tesla models from running into other vehicles detected by radar and a camera. Another is Autosteer, which uses radar and cameras to guide the vehicles on highways or in slow-moving traffic.

The attention generated by the fatal Tesla accident has stoked a public debate about the safety and wisdom of making “self-driving cars.” But in reality — YouTube videos of daredevil Tesla drivers notwithstanding — the current generation of technology involves driver-assistance features mainly meant to make vehicles safer and help avoid accidents.

Fully self-driving cars could eventually have huge potential to save lives, but the current generation of driver-assist cars can present major risks, said David Teater, a prominent traffic safety advocate. The problem, Teater said, is that the technology sends a mixed message to drivers, suggesting they can disengage while also requiring them to take over at a moment’s notice.

It is not an issue unique to Tesla, but one that potentially pertains to any carmaker that is “betting the farm on driver-assist technology that requires the driver to remain engaged,” he said.

“Would you want your kid jumping into a Tesla and turning on driving assist after what’s just happened?” Teater asked. Right now the technology, he said, “is more about marketing than safety.”

Jeff Larason, the Massachusetts highway safety director, is among those in the auto safety community who argue that automated and self-driving technologies will ultimately greatly reduce the toll from driving. Human error accounts for more than 90 percent of all car crashes, he said.

Larason said the Tesla fatality, while a “horrible tragedy,” should not undermine the good that accident-avoidance technology might achieve.

“When it happened,” he said, referring to the Florida fatality, “my response was, ‘Here comes the I-told-you-so, knee-jerk reaction.’”

If autonomous driving technology hits a roadblock, he said, “it won’t be because of technology, but because of legislation and because the public doesn’t have the will to make it happen.”

While Tesla has been the focus of the debate, many cars, including models from BMW, Mercedes-Benz and Volvo, have systems that use a combination of adaptive cruise control, lane-keeping and automatic braking.

While the systems enable drivers to briefly take their hands off the wheel, they are mainly designed to help drivers avoid accidents.

But the issues raised by the Tesla fatality, and the questions NHTSA is asking, involve whether Autopilot’s potential safety advantages offset the risks that such technology lulls drivers into a false sense of security.

Tesla remotely collects vast volumes of information about its cars as they are being driven. The federal safety agency apparently wants the company to thoroughly mine that data.

The agency’s letter asks Tesla to provide a wide range of information, including a list of all vehicles sold in the United States that are equipped with the Autopilot system. It also asks how many miles have been driven with Autosteer activated, and how many times the automatic system warned drivers to put their hands back on the steering wheel.

NHTSA also wants to know, among other things, the number of incidents in which Tesla vehicles’ automatic emergency braking was activated. The letter asks Tesla to turn over any information on consumer complaints or reports of crashes, or other incidents in which a vehicle’s accident-prevention systems may not have worked properly.

The agency is also seeking detailed data about how Tesla’s technology detects pedestrians, bicycles and other vehicles, including vehicles moving laterally across a road.

The tractor-trailer involved in the Florida crash made a left turn across the path of Brown’s Tesla. According to the accident investigation report, the car did not slow before hitting the trailer, continued under it at high speed, ran off the road and eventually struck a utility pole that brought it to a stop.

Tesla has said that in Brown’s accident, the Autopilot system failed to see the white truck against a bright sky.
