The Autopilot system has been cleared in the fatal accident involving a semi-autonomous car. Drivers must nonetheless remain vigilant.
On May 7, 2016, a man in his 40s was killed when his self-driving car collided with a truck on a Florida highway. At the time, the car was being piloted automatically by the Autopilot driver-assistance system.
Six months later, the investigation has concluded: it clears the driver-assistance system installed in the cars (Model S and Model X) sold by Tesla.
“No software defects”
“The investigation did not find any fault in the software. There is no evidence that there is a defect,” Bryan Thomas, spokesman for the National Highway Traffic Safety Administration (NHTSA), said Thursday during a conference call, quoted by AFP.
In its report, the American highway safety agency insists that its investigators “did not identify any fault, either in the design or in the performance of the Autopilot and emergency braking systems, nor any incidents in which these systems did not function properly”.
“Human Factors”
The accident appears to be linked to a “number of human factors”, the report specifies. In a first assessment dated June 2016, the NHTSA had already described the circumstances: “As far as we know, the vehicle was on a divided highway with Autopilot engaged when a tractor-trailer crossed the road perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the semi’s trailer against a very bright sky, so the brakes were not applied”.
That theory now seems to be confirmed. According to the latest report, “the truck should have been visible to the driver of the Tesla at least seven seconds before the impact”. The driver should have braked himself. The authorities do not specify the reason for his inattention, but a witness told the Associated Press at the time that the driver was watching a Harry Potter DVD when the accident occurred.
Not-so-autonomous cars
Therein lies the ambiguity of this automatic piloting system, which is meant to assist the driver rather than replace him. While it clears the software, the NHTSA says it is concerned about the use of the word “Autopilot” in the marketing and communication brochures deployed by the entire automotive industry to tout its technological prowess.
This designation “gives drivers the wrong idea of the car’s capabilities”, warns the NHTSA, which says its investigation of Tesla cars found that the onboard driver-assistance system “requires the full attention of the driver”. “A driver should never wait for automatic braking to kick in when he perceives a risk of collision,” it warns.
Tesla rolled out an upgrade to Autopilot in September that relies more heavily on radar and can operate through rain, fog and snow.
But “that doesn’t mean there isn’t a safety flaw,” warns the NHTSA, which says its investigations into incidents involving cars equipped with semi-autonomous features showed that drivers were “confused” about their role behind the wheel.