Researchers discover Tesla’s radar sensors can be fooled - with difficulty.
A team of researchers has discovered that it is possible to fool the radar sensors used by Tesla’s Autopilot system into not ‘seeing’ another car or obstacle in the road ahead.
“The worst case scenario would be that while the car is in self-driving mode and relying on the radar, the radar is obscured and fails to detect an obstacle ahead of it,” said Wenyuan Xu, a professor at the University of South Carolina who led the research. Other researchers from Zhejiang University in China and Chinese security firm Qihoo 360 contributed to the project.
The safety of the Autopilot system, and of self-driving systems generally, has been called into question in recent months following a fatal crash involving a Tesla Model S running in Autopilot mode. That crash occurred when the car’s sensors failed to ‘see’ a white semi-trailer against a bright sky.
The researchers found that Tesla’s radar sensors are vulnerable to attack from radio-jamming equipment - but mounting such an attack is neither easy nor cheap. It required a signal generator costing $90,000, used alongside a frequency multiplier costing several hundred dollars more.
The equipment was placed on a cart in front of a Model S, with both the cart and the car stationary. When it was switched on, the jamming signal made the cart ‘disappear’ from the radar, with no warnings displayed in the car. According to the researchers, the technique could be used to conceal objects in the road and so cause a crash. In practice, however, the equipment would either have to be sacrificed in the collision or aimed with extreme precision at the radar sensor from the roadside.
Further experiments found that the Tesla’s ultrasonic sensors - used for the self-parking and ‘summon’ functions - could be defeated with a sound generator or acoustic dampening foam. Attempts to ‘blind’ the cameras with lasers and LED lights failed, however, as the Autopilot system simply disengaged automatically.
Xu admitted the techniques used aren’t exactly practical, but stressed the need for vigilance. She said: “I don’t want to send out a signal that the sky is falling, or that you shouldn’t use Autopilot. These attacks actually require some skill. But highly motivated people could use this to cause personal damage or property damage.
“Overall we hope people get from this work that we still need to improve the reliability of these sensors. And we can’t simply depend on Tesla and not watch out for ourselves,” Xu added.