For years, Tesla has tested autonomous vehicle technology on public roads without reporting crashes and system failures to the California Department of Motor Vehicles, as other robot car developers are required to do under DMV regulations.
But confronted with dozens of viral videos showing Tesla’s Full Self-Driving beta technology driving the car into dangerous situations, and a letter of concern from a key state legislator, the DMV now says it’s reviewing Tesla’s behavior and reassessing its own policies.
The agency informed Tesla on Jan. 5 that it is “revisiting” its opinion that the company’s test program doesn’t fall under the department’s autonomous vehicle regulations because it requires a human driver.
“Recent software updates, videos showing dangerous use of that technology, open investigations by the National Highway Traffic Safety Administration, and the opinions of other experts in this space” prompted the reevaluation, the DMV said in a letter Monday to state Sen. Lena Gonzalez (D-Long Beach), chair of the Senate's transportation committee.
Concerned about public safety, Gonzalez asked the DMV in December for its take on Tesla’s Full Self-Driving beta program, under which Tesla owners supervise the operation of cars programmed to autonomously navigate highways, city streets and neighborhood roads, stopping at traffic lights and stop signs as well as making left and right turns into traffic.
Those are the same features being tested by other robot car developers that report crashes and disengagements to the DMV, a group that includes Waymo, Cruise, Argo and Zoox. Although their cars occasionally crash, there are few YouTube videos that show them behaving dangerously.
Unlike the other companies, Tesla uses no trained test drivers. Participants in the Full Self-Driving beta have paid $10,000 for access, a price soon to rise to $12,000.
If the DMV requires Tesla to conform to its autonomous vehicle testing regulations, the company would have to report crashes and system failures, giving the public the hard data needed to evaluate how safe, or how dangerous, the technology is. The rules would also stiffen requirements for test drivers.
Thus far, the DMV has not required Tesla to report crashes and disengagements — situations in which the robot software turns control over to the test driver. The agency has said it considers Full Self-Driving a “Level 2” driver assist system, akin to systems from other carmakers that include lane-keeping, adaptive cruise control, and automatic lane changing.
In her December letter, Gonzalez asked the DMV for its assessment of the ongoing Full Self-Driving beta trial and whether it posed a public danger.
In its response, the DMV cited a Full Self-Driving beta demonstration conducted more than a year ago, in November 2020: “The vehicle could not safely complete the entire task of driving on its own” and was unable “to recognize or respond to ‘static objects, road debris, emergency vehicles, construction zones, large uncontrolled intersections, adverse weather, complicated vehicles in the driving path, and unmapped roads.’”
Gonzalez's office said the senator is reviewing the DMV’s response letter and will speak with The Times in the near future.
The Times for months has requested an interview with DMV head Steve Gordon, but he has consistently declined, most recently through a spokeswoman on Tuesday.
The agency has in the past pointed to California state law in defense of its current approach. California’s laws on autonomous vehicle technology use definitions derived from a document published by the Society of Automotive Engineers, which breaks down vehicle automation into six levels, from Level 0 to Level 5.
The DMV has said it considered Full Self-Driving to be Level 2, because, according to Tesla, it requires a human driver to assure safety. But so do test cars from the other robotaxi companies developing Level 4 vehicles, said Phil Koopman, an engineering professor at Carnegie Mellon University who has assisted the SAE on its standard-setting documents.
“The DMV concludes that FSD is not an automated vehicle because a human driver must monitor to intervene. That is a description that fits any AV test vehicle with a safety driver, which FSD is,” Koopman said in an email to The Times.
Trained safety drivers, he noted, are expected to stop robot cars from making errors that could endanger other road users. “Those driving errors include traffic law violations such as running red lights and running stop signs which should have been stopped by a responsible test driver but were not,” he said. A recent YouTube video shows a woman behind the wheel of an FSD beta vehicle allowing the car to run a red light.
He added that the DMV would find it hard to answer Gonzalez's question about public danger because it lacks data — the kind of data the DMV collects from other robot-car developers. The DMV “does not actually address this question, perhaps because they have no testing data for FSD. But the reason they have no data is that they have let Tesla get away with claiming Level 2, thus evading regulatory oversight.”
In its letter to Gonzalez, the agency referred to the deficiencies in FSD beta technology to justify its Level 2 designation, including its inability to recognize basic objects such as road debris, or identify emergency vehicles.
“Legal merits aside, the [claim that] the technology was so bad it couldn’t possibly be autonomous is just not a reassuring statement,” said Bryant Walker Smith, autonomous vehicle law specialist at the University of South Carolina.
Koopman noted that the DMV response letter does not mention a crucial element in the SAE standards document, known as J3016 and included as a defining document in California autonomous vehicle law. That is “design intent.” The SAE document says it is “incorrect” to designate a Level 4 design feature as Level 2 “simply because on-road testing requires a test driver to supervise the feature while engaged, and to intervene if necessary to maintain operation.”
Tesla Chief Executive Elon Musk has made his design intent clear. For years, he’s been promising that Full Self-Driving technology will produce autonomous robotaxis that owners can rent out for extra money when they’re not using them. Those promises have not come close to being fulfilled, and many autonomous vehicle experts think the YouTube videos show just how far away true Full Self-Driving really is.
The DMV offered no details on what its “revisit” would entail or how long it would take to reach a conclusion.
DMV regulations bar a company from selling a technology as autonomous when it’s not. The agency announced a “review” of Tesla’s possible violation of that regulation last May. Eight months later, the review is “ongoing,” according to the DMV.