US auto safety regulators have identified a 12th crash in which a Tesla using its advanced driver assistance system struck or nearly struck an emergency vehicle, and have demanded that the automaker answer detailed questions about its Autopilot system. The National Highway Traffic Safety Administration (NHTSA) said it had opened a formal safety probe into Tesla’s Autopilot driver assistance system after 11 such crashes. The probe covers 765,000 U.S. Tesla vehicles built between 2014 and 2021. As part of its investigation, the safety agency sent Tesla an 11-page letter with numerous questions it must answer.
Tesla’s Autopilot handles some driving tasks and allows drivers to keep their hands off the wheel for extended periods. Tesla says Autopilot enables vehicles to steer, accelerate and brake automatically within their lane. NHTSA said Tesla could face civil penalties of up to $115 million if it fails to fully respond to the questions.
On Saturday, the Florida Highway Patrol said a Tesla, which the driver said was in Autopilot mode, struck the patrol car of a trooper who had stopped on a major highway to assist a disabled motorist. According to a police report released on Wednesday, the trooper “narrowly missed being struck as he was outside of his patrol car.”
NHTSA said earlier it had reports of 17 injuries and one death in the 11 crashes. A December 2019 crash of a Tesla Model 3 in Indiana left a passenger dead after the vehicle collided with a parked fire truck. NHTSA’s request for information asks Tesla to detail how Autopilot detects and responds to emergency vehicles, flashing lights, road flares, cones and barrels, and to describe the impact of low-light conditions.
Tesla in July introduced an option for some customers to subscribe to its advanced driver assistance software, dubbed “Full Self-Driving (FSD) capability.” NHTSA is seeking the “date and mileage at which the ‘Full Self Driving’ (FSD) option was enabled” for all vehicles, along with all consumer complaints, field reports, crash reports and lawsuits. The agency also wants Tesla to explain how it prevents use of the system outside the areas where it is designed to operate.
Among the detailed questions, NHTSA asked Tesla to explain the “testing and validation required prior to the release of the subject system or an in-field update to the subject system, including hardware and software components of such systems.” Tesla must respond to NHTSA’s questions by October 22, the agency said, and must disclose plans for any changes to Autopilot over the following 120 days.
Source: Reuters