Tesla’s Full Self-Driving Regulatory Status To Be Further Reviewed by California DMV
The California Department of Motor Vehicles is taking a closer look at the regulatory status of Tesla’s Full Self-Driving (FSD) Beta program to determine whether it should be subject to the DMV’s autonomous vehicle regulations. Those regulations already apply to other companies developing self-driving vehicles, including Waymo, Cruise, and Zoox.
According to a report by the Los Angeles Times, the California DMV is examining whether Tesla’s FSD Beta program should be grouped with the companies testing autonomous vehicles on public roads. Tesla has previously been excluded from that group because the feature requires a human driver; the company maintains that its cars are not fully autonomous and that drivers must remain alert at all times.
However, on Jan. 5, the California DMV told Tesla it would “reconsider” the automaker’s potential inclusion in the autonomous vehicle testing regulatory program. Inclusion would require Tesla to report all system failures and crashes to the California DMV, as other companies in the program are already required to do. Several factors appear to be driving the reconsideration, including investigations by the National Highway Traffic Safety Administration (NHTSA).
“Recent software updates, videos showing unsafe use of this technology, investigations initiated by the National Highway Traffic Safety Administration and the opinions of other experts in this space have prompted the re-evaluation,” the DMV said in a letter to Lena Gonzalez, a state senator from Long Beach, California.
The beta program is limited: Tesla only allows drivers with a Safety Score of 98 or higher to receive the exclusive updates, and even some drivers with such high scores have not been admitted, as Tesla remains extremely cautious about the rollout. Concerns about some of the FSD Beta’s capabilities have prompted immediate updates from Tesla. For example, the October FSD 10.3 release triggered false forward collision warnings when no threat was actually present. Tesla released an update to fix the bug less than 24 hours later.
Tesla reports safety statistics, but has not done so since the second quarter of 2021. Figures for that period included one crash for every 4.41 million miles driven with Autopilot engaged, which includes Autosteer and active safety features. Drivers who were not using Autopilot had one accident for every 1.2 million miles traveled. NHTSA data indicates there is one accident in the United States for every 484,000 miles traveled.
I would love to hear from you! If you have any comments, concerns, or questions, please email me at [email protected]. You can also reach me on Twitter @KlenderJoey, or if you have news tips, you can email us at [email protected].