US auto safety regulators said Monday they had opened a formal safety probe into Tesla Inc’s driver assistance system Autopilot after a series of crashes involving emergency vehicles.
The National Highway Traffic Safety Administration (NHTSA) said that since January 2018 it had identified 11 crashes in which Tesla models “have encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.”
After investigating, NHTSA could opt to take no action, or it could demand a recall, which might effectively impose limits on how, when and where Autopilot operates. Any restrictions could narrow the competitive gap between Tesla’s system and similar advanced driver assistance systems offered by established automakers.
The auto safety agency said it had reports of 17 injuries and one death in those crashes.
Tesla shares were down 3.6% on news of the investigation.
The company did not immediately respond to a request for comment. Chief Executive Elon Musk has repeatedly defended Autopilot and in April tweeted that “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle.”
NHTSA said the 11 crashes included four this year, most recently one last month in San Diego, and it had opened a preliminary evaluation of Autopilot in 2014-2021 Tesla Models Y, X, S, and 3.
“The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes,” NHTSA said in a document opening the investigation.
The probe covers an estimated 765,000 Tesla vehicles in the United States, NHTSA said.
NHTSA has in recent years sent numerous special crash investigation teams to review a series of Tesla crashes.
It said most of the 11 crashes took place after dark, and the crash scenes included scene-control measures such as emergency vehicle lights, flares or road cones.
NHTSA said its investigation “will assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation.”
Before NHTSA can demand a recall, it must first decide to upgrade the preliminary evaluation into an engineering analysis. The two-step investigative process often takes a year or more.
Autopilot, which handles some driving tasks and allows drivers to keep their hands off the wheel for extended periods, was operating in at least three Tesla vehicles involved in fatal U.S. crashes since 2016, the National Transportation Safety Board (NTSB) has said.
The NTSB has criticized Tesla’s lack of system safeguards for Autopilot and NHTSA’s failure to ensure the system’s safety.
In February 2020, Tesla’s director of autonomous driving technology, Andrej Karpathy, identified a challenge for its Autopilot system: how to recognize when a parked police car’s emergency flashing lights are turned on.
“This is an example of a new task we would like to know about,” Karpathy said at a conference.
In one of the cases, a doctor was watching a movie on a phone when his vehicle rammed into a state trooper in North Carolina.
Bryant Walker Smith, a law professor at the University of South Carolina, said the pattern of crashes into parked emergency vehicles “really seems to illustrate in vivid and even tragic fashion some of the key concerns with Tesla’s system.” He said the system induces driver complacency and does not work in some non-typical circumstances.
NHTSA, he suggested, “has been far too deferential and timid, particularly with respect to Tesla.”
One of the 11 crashes NHTSA cited was a January 2018 collision with a parked fire truck in Culver City, California. NTSB said the system’s design “permitted the driver to disengage from the driving task” in that crash.
NHTSA said Monday it had sent teams to review 31 Tesla crashes involving 10 deaths since 2016 where it suspected advanced driver assistance systems were in use. It ruled out the systems in three of the crashes.
In a statement, NHTSA reminded drivers “no commercially available motor vehicles today are capable of driving themselves … Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”
Tesla and CEO Musk have sparred with U.S. agencies over the years on various safety issues.
In February, after U.S. auto safety regulators sought the recall, Tesla agreed to recall 134,951 Model S and Model X vehicles whose touchscreen displays could fail and raise the risk of a crash.
NHTSA made a rare formal recall request to Tesla in January, noting that other automakers had issued numerous recalls for similar touchscreen-related safety issues.
Musk said last month on Twitter the automaker will hold “Tesla AI Day” on Thursday to “go over progress with Tesla AI software & hardware, both training & inference. Purpose is recruiting.”
In January 2017, NHTSA closed a preliminary evaluation into Autopilot covering 43,000 vehicles without taking any action after a nearly seven-month investigation.
NHTSA said at the time it “did not identify any defects in the design or performance” of Autopilot, “nor any incidents in which the systems did not perform as designed.”
NHTSA has not had a Senate-confirmed administrator since January 2017, and, nearly seven months into his term, President Joe Biden has not nominated anyone for the post.