California has informed Tesla that it is considering tighter regulation of the electric car maker’s driver assistance tools that are currently being tested on public roads after videos of worrying episodes were posted online.
Several clips posted to YouTube and Twitter show drivers testing the Full Self-Driving beta having to abruptly retake control to stop their Tesla from hitting a pole or veering into oncoming traffic.
Tesla says the tools require active driver supervision, but California’s Department of Motor Vehicles said in a Jan. 5 letter to the company that it is reviewing whether the features meet the definition of an autonomous vehicle.
Elon Musk’s car company has recruited select drivers to test FSD Beta in real-world conditions; the software is said to be capable of navigating city streets, stopping automatically, and making turns.
The California DMV wrote in its letter that it is reconsidering its “classification decision following recent software updates, videos showing the dangerous use of this technology and open investigations” by US regulators.
“DMV will initiate a further review of the latest versions, including enhancements to the program and features,” the letter said.
Should the DMV decide to classify Tesla’s driver assistance systems as autonomous vehicle technology, the company would face stricter rules.
For example, Tesla would have to report problems to the authorities and identify all drivers who are testing its new tools.
The company did not respond to a request for comment.