The National Transportation Safety Board says the driver of a Tesla SUV who died in a Silicon Valley crash two years ago was playing a video game on his smartphone while his vehicle was being controlled by a partially automated driving system.
Chairman Robert Sumwalt said at the start of a hearing Tuesday that systems like Tesla’s Autopilot cannot drive themselves, yet drivers continue to use them without paying attention.
“If you own a car with partial automation, you do not own a self-driving car,” Sumwalt said in opening statements. “This means that when driving in the supposed ‘self-driving’ mode, you can’t read a book, you can’t watch a movie or TV show, you can’t text and you can’t play video games.”
The board will determine a cause of the crash at the hearing and make recommendations to prevent it from happening again. Sumwalt says government regulators have ignored the board’s previous recommendations for measures to prevent these crashes.
The March 2018 crash killed Apple engineer Walter Huang when his Tesla Model X SUV swerved and slammed into a concrete barrier dividing freeway and exit lanes in Mountain View, Calif.
Just before the crash, the Tesla steered to the left into a paved area between the freeway travel lanes and an exit ramp, the NTSB said. It accelerated to 71 mph and crashed into the end of the concrete barrier. The car’s forward collision avoidance system didn’t alert Huang, and its automatic emergency braking did not activate, the NTSB said.
Also, Huang did not brake, and there was no steering movement detected to avoid the crash, the board’s staff said.
NTSB investigators previously found that Tesla’s system became confused at a freeway exit and was a factor in the crash. Documents released earlier this month quoted Huang’s relatives as saying he had previously complained about Autopilot malfunctioning and swerving in the area near where the crash occurred.
Autopilot is designed to keep a vehicle in its lane and keep a safe distance from vehicles in front of it. It also can change lanes with driver approval. Tesla says Autopilot is intended to be used for driver assistance and that drivers must be ready to intervene at all times.
NTSB staff determined that Tesla’s system does not adequately make sure drivers are paying attention and recommended that stronger driver monitoring systems be required. Sumwalt said the board had made recommendations to six automakers in 2017 to stop the problem and only Tesla has failed to respond.
Teslas can sense when a driver applies force to the steering wheel, and if that doesn’t happen, the car issues visual and audio warnings. But monitoring steering wheel torque “is a poor surrogate measure” of monitoring the driver, Ensar Becic, the NTSB’s human performance and automation highway safety expert, told the board.
Messages were left Tuesday afternoon seeking comment from Tesla.
Under questioning from board members, Robert Molloy, the NTSB’s director of highway safety, said the National Highway Traffic Safety Administration is taking a hands-off approach to regulating new automated driving systems like Autopilot. Molloy called the approach “misguided,” and said nothing is more disappointing than seeing recommendations ignored by Tesla and NHTSA.
The NTSB can only make recommendations, while NHTSA is the agency charged with making regulations.
“They need to do more,” he said of the federal highway safety agency.
NHTSA has told the NTSB it has investigations open into 14 Tesla crashes and would use its enforcement of safety defects to take action if needed.
The agency issued a statement saying it will review the NTSB’s report and that all commercially available vehicles require human drivers to stay in control at all times.
“Distraction-affected crashes are a major concern, including those involving advanced driver assistance features,” the statement said.
Sumwalt said the NTSB had called for technology more than nine years ago to disable distracting functions of smartphones while the user is driving, but no action has been taken.
Don Karol, the NTSB’s project manager for highway safety, told the board that the staff is recommending that cell phone companies program phones to automatically lock out distracting functions such as games and phone calls while someone is driving. The staff also recommends that companies enact policies to prevent use of company-issued cell phones while workers are driving.
“Lockout mechanisms should be the default setting and should automatically lock out distracting functions,” Karol told the board.
Tesla has said Autopilot was put out initially in “beta,” meaning it was being tested and improved as bugs were identified, Karol told the board.
That brought a response from Vice Chairman Bruce Landsberg, who said that if the system has known bugs, “it’s probably pretty foreseeable that somebody’s going to have a problem with it. And then they (Tesla) come back and say ‘oh, we warned you.’”