Tesla, which disbanded its public relations team in 2021, did not respond to WIRED’s request for comment. The camera systems the researchers used in their tests were manufactured by HP, Pelsee, Azdome, Imagebon, and Rexing; none of those companies responded to WIRED’s requests for comment.

Although the NHTSA acknowledges issues in “some advanced driver assistance systems,” the researchers are clear: They’re not sure what this observed emergency light effect has to do with Tesla’s Autopilot troubles. “I do not claim that I know why Teslas crash into emergency vehicles,” says Nassi. “I do not know even if this is still a vulnerability.”

The researchers’ experiments were also concerned solely with image-based object detection. Many automakers use other sensors, including radar and lidar, to help detect obstacles in the road. A smaller crop of tech developers—Tesla among them—argues that image-based systems augmented with sophisticated artificial intelligence training can enable not only driver assistance systems, but also completely autonomous vehicles. Last month, Tesla CEO Elon Musk said the automaker’s vision-based system would enable self-driving cars next year.

Indeed, how a system might react to flashing lights depends on how individual automakers design their automated driving systems. Some may choose to “tune” their technology to react to things it’s not entirely certain are actually obstacles. Taken to the extreme, that choice could lead to “false positives,” where a car might hard brake, for example, in response to a toddler-shaped cardboard box. Others may tune their tech to react only when it’s very confident that what it’s seeing is an obstacle. At the other extreme, that choice could lead to the car failing to brake before a collision because it doesn’t recognize the other vehicle as a vehicle at all.
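The tradeoff described above is, at its core, a confidence threshold on the object detector’s output. The following is an illustrative sketch only, with made-up labels and scores (nothing here comes from the researchers’ experiments): a low threshold brakes for the cardboard box, while a high threshold misses a real car whose detection score has been degraded.

```python
# Illustrative only: hypothetical detections with made-up confidence scores.
detections = [
    {"label": "vehicle", "score": 0.91},        # a real car, clearly seen
    {"label": "vehicle", "score": 0.38},        # a real car, degraded by flashing lights
    {"label": "cardboard_box", "score": 0.55},  # a toddler-shaped cardboard box
]

def obstacles(dets, threshold):
    """Return the detections the system would treat as obstacles at this threshold."""
    return [d for d in dets if d["score"] >= threshold]

# A cautious (low) threshold reacts to the box too -> false positive, hard brake.
print([d["label"] for d in obstacles(detections, 0.3)])
# -> ['vehicle', 'vehicle', 'cardboard_box']

# A strict (high) threshold ignores the degraded real car -> missed obstacle.
print([d["label"] for d in obstacles(detections, 0.8)])
# -> ['vehicle']
```

Neither setting is “correct”; each automaker picks a point on this curve, which is why the same flashing lights could affect different systems differently.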

The BGU and Fujitsu researchers did come up with a software fix for the emergency flasher issue. Called “Caracetamol”—a portmanteau of “car” and the painkiller “Paracetamol”—it’s designed to avoid the “seizure” issue by being specifically trained to identify vehicles with emergency flashing lights. The researchers say it improves object detectors’ accuracy.
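The paper’s actual Caracetamol implementation isn’t reproduced here. As a loose conceptual sketch of the failure mode it targets, consider a detector whose per-frame output flickers in sync with strobing lights; one generic mitigation (not necessarily the researchers’ approach) is to smooth detections over a short window of frames, so a vehicle that drops out for a frame or two is still treated as present:

```python
from collections import deque

# Hypothetical sketch, not the paper's Caracetamol code: smooth per-frame
# detections so flasher-induced "seizure" flicker doesn't make a vehicle
# vanish from the system's view every few frames.
class SmoothedDetector:
    def __init__(self, window=5, min_hits=2):
        self.history = deque(maxlen=window)  # recent per-frame detection flags
        self.min_hits = min_hits             # frames required to confirm presence

    def update(self, detected_this_frame: bool) -> bool:
        self.history.append(detected_this_frame)
        return sum(self.history) >= self.min_hits

det = SmoothedDetector()
# Flicker pattern as lights strobe: detected, missed, detected, missed, detected.
frames = [True, False, True, False, True]
print([det.update(f) for f in frames])
# -> [False, False, True, True, True]
```

Once confirmed, the smoothed output stays stable through the flicker, which is the behavior a fix like Caracetamol aims for, however it achieves it internally.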

Earlence Fernandes, an assistant professor of computer science and engineering at the University of California, San Diego, who was not involved in the research, said it appeared “sound.” “Just like a human can get temporarily blinded by emergency flashers, a camera operating inside an advanced driver assistance system can get blinded temporarily,” he says.

For researcher Bryan Reimer, who studies vehicle automation and safety at the MIT AgeLab, the paper points to larger questions about the limitations of AI-based driving systems. Automakers need “repeatable, robust validation” to uncover blind spots like susceptibility to emergency lights, he says. He worries some automakers are “moving technology faster than they can test it.”


