A group of researchers claims it has successfully fooled several aspects of Tesla’s ‘Autopilot’ feature, causing the system to fail to recognise real-world obstacles in the vehicle’s path.
The group, a collaboration between researchers from China’s Zhejiang University, the University of South Carolina and Chinese security firm Qihoo 360, says it will release full details of the hack at the upcoming Defcon hacker conference.
However, the group has already revealed that it was able to fool Tesla’s Autopilot system using radio, sound and light-emitting componentry widely available off the shelf.
The group says that almost all the tests were conducted with a stationary vehicle, with varying rates of success. Wenyuan Xu, a professor from the University of South Carolina, told Wired:
“The worst case scenario would be that while the car is in self-driving mode and relying on the radar, the radar is obscured and fails to detect an obstacle ahead of it. That would be a bad thing.”
The group found the most cost-effective means of disrupting Tesla’s system was to attack its short-range ultrasonic sensors, which are utilised for self-parking and the Summon feature. Using equipment totalling US$40, the team was able to trick the Tesla’s brain into thinking there was an obstacle obstructing its path, despite the test area being clear.
In a related test, the team used acoustic-dampening foam to muffle the same sensors, leaving the system unable to identify surrounding objects.
Another, far less affordable, means of attacking the Tesla’s brain used a US$90,000 signal generator and an even more expensive VDI frequency multiplier to jam the Tesla’s radar signal. “When there’s jamming, the car disappears, and there’s no warning,” Professor Xu said.
Professor Xu says the effort is intended to help Tesla further improve its system and reduce the chance of incidents like the crash that killed a driver earlier this year.
“I don’t want to send out a signal that the sky is falling, or that you shouldn’t use autopilot. These attacks actually require some skills. But highly motivated people could use this to cause personal damage or property damage… Overall we hope people get from this work that we still need to improve the reliability of these sensors. And we can’t simply depend on Tesla and not watch out for ourselves,” Xu said.
This team is not the first to fool Tesla’s systems: over a year ago, Marc Rogers and Kevin Mahaffey successfully hacked a Model S’ infotainment system and took control of the system.