DMV 'revisiting' its approach to regulating Tesla's public self-driving test

California's Department of Motor Vehicles has taken a different approach to regulating tests of Tesla's self-driving software than it has to automakers testing similar systems. That may be about to change. (Christophe Ena / Associated Press)

For years, Tesla has tested autonomous vehicle technology on public roads without reporting crashes and system failures to the California Department of Motor Vehicles, as other robot car developers are required to do under DMV regulations.

But confronted with dozens of viral videos showing Tesla’s Full Self-Driving beta technology driving the car into dangerous situations, and a letter of concern from a key state legislator, the DMV now says it’s reviewing Tesla’s behavior and reassessing its own policies.

The agency informed Tesla on Jan. 5 that it is “revisiting” its opinion that the company’s test program doesn’t fall under the department’s autonomous vehicle regulations because it requires a human driver.

“Recent software updates, videos showing dangerous use of that technology, open investigations by the National Highway Traffic Safety Administration, and the opinions of other experts in this space” prompted the reevaluation, the DMV said in a letter Monday to state Sen. Lena Gonzalez (D-Long Beach), chair of the Senate's transportation committee.

Concerned about public safety, Gonzalez asked the DMV in December for its take on Tesla’s Full Self-Driving beta program, under which Tesla owners supervise the operation of cars programmed to autonomously navigate highways, city streets and neighborhood roads, stopping at traffic lights and stop signs as well as making left and right turns into traffic.

Those are the same features being tested by other robot car developers that report crashes and disengagements to the DMV, a group that includes Waymo, Cruise, Argo and Zoox. Although their cars occasionally crash, there are few YouTube videos that show them behaving dangerously.

Unlike the other companies, Tesla is not using trained test drivers. Participants in the Full Self-Driving beta have paid $10,000 for the privilege, a price soon to rise to $12,000.

If the DMV requires Tesla to conform to its autonomous vehicle testing regulations, the company would have to report crashes and system failures, giving the public the hard data needed to evaluate how safe, or how dangerous, the technology is. It would also stiffen test-driver requirements.

Thus far, the DMV has not required Tesla to report crashes and disengagements — situations in which the robot software turns control over to the test driver. The agency has said it considers Full Self-Driving a “Level 2” driver assist system, akin to systems from other carmakers that include lane-keeping, adaptive cruise control, and automatic lane changing.

Tesla Full Self-Driving beta vehicles are all over YouTube and Twitter driving into oncoming traffic, choosing to drive on railroad tracks instead of a street, and aiming themselves into metal posts and traffic barriers. A Tesla on Autopilot mistook the moon for a yellow light. Tesla recently added a feature to the technology called "assertive" mode that allows "rolling stops" at stop signs.