Original Post by: Tara Seals | InfoSecurity | Published on: 08/05/2016 | Photo Credit: Posteriori
Tesla’s Model S has a much-touted feature known as Autopilot, designed for semi-autonomous driving using radar and an array of sensors. Unfortunately, researchers have been able to sabotage the system, showing that it is possible to make surrounding objects “disappear” from the autopilot’s view.
Tesla’s autopilot detects the car’s surroundings using radar, ultrasonic sensors and cameras. A group of researchers at the University of South Carolina, China’s Zhejiang University and the Chinese security firm Qihoo 360 probed all three systems, but was only able to fully defeat the radar portion. But that’s enough—with the right hack, it would be possible to cause a high-speed collision.
The team used off-the-shelf radio-, sound- and light-emitting tools to fool the autopilot system, both masking real objects and spoofing ones that aren’t actually there. In the radar attack, a cart carrying the signal-generation gear stood in for another car; its broadcast drowned out the radio waves bouncing back to the Tesla, making the obstacle effectively invisible to radar.
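Conceptually, jamming hides a target by raising the radar’s noise floor until the legitimate echo no longer stands out. A minimal sketch of that idea (all power levels and the threshold are illustrative, not measurements from the research):

```python
# Sketch: a radar declares a target only if the echo rises far enough
# above the noise floor. Jamming lifts the noise floor instead of
# touching the echo, so the same return vanishes from detection.
# All dBm values below are invented for illustration.

def target_detected(echo_power_dbm, noise_floor_dbm, snr_threshold_db=10.0):
    """Return True if the echo exceeds the noise by the required margin."""
    snr_db = echo_power_dbm - noise_floor_dbm
    return snr_db >= snr_threshold_db

# Normal conditions: a weak echo (-70 dBm) against thermal noise (-90 dBm).
print(target_detected(-70, -90))   # 20 dB of SNR -> detected

# Jamming: broadband interference lifts the noise floor to -65 dBm.
print(target_detected(-70, -65))   # echo now below the noise -> "invisible"
```

The key point, matching what the researchers observed, is that the obstacle is still physically there and still reflecting energy; the receiver simply can no longer separate that reflection from the attacker’s noise.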
“The worst case scenario would be that while the car is in self-driving mode and relying on the radar, the radar is obscured and fails to detect an obstacle ahead of it,” said Wenyuan Xu, a University of South Carolina professor and research lead on the project, speaking to WIRED. “When there’s jamming, the ‘car’ disappears, and there’s no warning.”
The results, like the connected Jeep hack shown at Black Hat last year, also make it clear that this kind of attack is far from easy: for it to work in the real world, another car would need to be outfitted with the jamming gear and driven alongside the Tesla. The barrier to entry is also fairly high, as the attack required expensive equipment, including a $90,000 signal generator from Keysight.
They demonstrated a few other hacks too, including hiding objects from the ultrasonic sensors by wrapping them in acoustic-dampening foam and tricking the Tesla’s self-parking mechanism into thinking there was an object in its intended space. None of these seems to pose any immediate danger, but they do point out that there’s more work to do on the anti-hacking front.
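The ultrasonic tricks exploit the fact that such sensors infer distance purely from echo time-of-flight: foam absorbs the pulse so no echo returns (object hidden), while an attacker-emitted pulse that arrives early reads as a nearby obstacle (object spoofed). A rough sketch of the underlying arithmetic, with invented timings:

```python
# Sketch: an ultrasonic parking sensor converts round-trip echo time to
# distance. It cannot tell a genuine echo from an attacker's pulse, so an
# early injected pulse is interpreted as a close object. Timings invented.

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 C

def echo_to_distance_m(round_trip_s):
    """Distance implied by an echo's round-trip time (out and back)."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2

print(echo_to_distance_m(0.0175))  # genuine echo: wall about 3 m away
print(echo_to_distance_m(0.003))   # spoofed early pulse: reads ~0.5 m
```

Foam masking is the inverse failure: with the echo absorbed, no round-trip time is ever measured, and the sensor concludes nothing is there.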
“I don’t want to send out a signal that the sky is falling, or that you shouldn’t use autopilot. These attacks actually require some skills,” Xu said. “[Tesla] needs to think about adding detection mechanisms as well. If the noise is extremely high, or there’s something abnormal, the radar should warn the central data processing system and say ‘I’m not sure I’m working properly.'”
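The self-diagnostic Xu describes could, in rough outline, compare the measured noise floor against an expected baseline and flag its own readings as unreliable when the gap is abnormal. A hypothetical sketch (baseline and margin values are invented, not Tesla’s):

```python
# Hypothetical sketch of the mitigation Xu suggests: rather than silently
# reporting "no obstacle" under heavy interference, the radar tells the
# central data processor its output may be untrustworthy.
# Thresholds below are illustrative assumptions.

BASELINE_NOISE_DBM = -90.0   # assumed nominal noise floor
ALARM_MARGIN_DB = 15.0       # excess noise that triggers a warning

def radar_health(measured_noise_dbm):
    """Classify the radar's own condition from its measured noise floor."""
    excess_db = measured_noise_dbm - BASELINE_NOISE_DBM
    if excess_db > ALARM_MARGIN_DB:
        return "degraded: possible jamming, do not trust 'no obstacle'"
    return "ok"

print(radar_health(-88))  # small fluctuation -> "ok"
print(radar_health(-60))  # 30 dB of excess noise -> degraded
```

The design point is that a jammed sensor and an empty road produce the same “no target” answer; only by also monitoring the noise environment can the system distinguish the two.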
Reference Article: http://www.infosecurity-magazine.com/news/researchers-hack-tesla-ss/