Tesla has been pressed by the US Department of Transportation and the National Highway Traffic Safety Administration (NHTSA) for information regarding its driver-assistance technologies, specifically whether it has barred the people testing those features from reporting potential safety concerns.
On Tuesday, October 12, the regulator asked Elon Musk's electric car company to submit data on the non-disclosure agreements (NDAs) signed by users who have been testing the new feature since October 2020.
The request is part of a preliminary investigation launched after a string of accidents involving emergency vehicles.
Tesla Autopilot Crashes
According to Nasdaq, two people were killed on September 13 in Coral Gables, Florida, after a Tesla Model 3 crashed into a tree and caught fire.
Following the accident, the National Transportation Safety Board (NTSB) announced that a team of three investigators had been dispatched to the area to investigate the deadly collision.
The NTSB, which makes safety recommendations but does not regulate automakers, stated that it would extensively examine the Tesla Model 3's new technologies. It will also look into the operation of the electric vehicle, as well as the post-crash fire that completely destroyed it.
The Coral Gables Police Department said it was unclear whether the Tesla Model 3 involved in the crash was using Tesla's Autopilot driver-assistance technology.
The NTSB will begin its investigation almost immediately and expects to finish it within a week, with a preliminary report due in roughly 30 days.
That was the agency's second investigation into a deadly accident involving Tesla vehicles in less than six months.
The board was also looking into a deadly collision in Texas last April, that time involving a Tesla Model S that likewise struck a tree and burst into flames, killing both occupants.
This isn't the first time the electric vehicle manufacturer has attracted attention for the wrong reasons. Several Tesla vehicles have been involved in accidents in the past, with the company's Autopilot semi-autonomous driver-assistance technology taking the blame. In some cases, the cars' batteries are said to have sparked fires.
Tesla Full Self-Driving (FSD) Feature
The Full Self-Driving (FSD) feature allows Tesla cars to detect stop signs and turn at intersections, whereas the existing Autopilot function is mostly used to control speed and keep the vehicle in a lane.
As reported by First Post, the NHTSA received a report alleging that the non-disclosure agreements these testers sign require participants to keep silent about negative confidential information concerning the Tesla FSD feature.
The public agency wrote a letter to Tesla stating that "Any agreement that may prevent or dissuade participants in the early access beta release program from reporting safety concerns to NHTSA is unacceptable," and demanded that the company respond by November 1.
In another letter, the NHTSA further asked Tesla to explain why it has not issued a recall despite updating its driver-assistance software to improve the recognition of emergency vehicle lights at night.
Manufacturers are required to recall automobiles if safety issues are discovered, according to the safety watchdog.
The government also inquired about how the company chose the drivers who began testing a new version of its self-driving technology, dubbed FSD Beta 10.2, earlier this month.
On Monday, October 11, Tesla CEO and one of the world's richest men, Elon Musk, revealed on Twitter that this version was being rolled out to the drivers the company rates as its safest.
Autopilot technology has sparked debate following a spate of crashes involving the electric vehicles. Tesla's decision to test beta versions of new assistance features with ordinary drivers in real-world conditions, without seeking official clearance, has only intensified that debate.