Adversarial Machine Learning against Tesla's Autopilot
Schneier on Security
APRIL 4, 2019
Researchers have been able to fool Tesla's Autopilot in a variety of ways, including tricking it into driving into oncoming traffic. The attack requires only the placement of stickers on the road.

Abstract: Keen Security Lab has maintained its security research on Tesla vehicles and shared our results at Black Hat USA in both 2017 and 2018. Based on the ROOT privilege of the APE (Tesla Autopilot ECU, software version 18.6.1), we did some further interesting research work on this module.
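The sticker attack is an instance of an adversarial example: a small, carefully chosen perturbation of the input that flips a model's decision. As a rough illustration only (not Keen Lab's actual method, and far simpler than a real vision pipeline), here is an FGSM-style sketch against a hypothetical linear "lane detector"; the weights, input, and epsilon are all made-up values:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical fixed "lane detector": a logistic model over 3 features.
w = [2.0, -3.0, 1.0]
b = 0.0

def predict(x):
    """Probability that a lane marking is present (> 0.5 means 'lane')."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def fgsm(x, eps):
    """FGSM-style perturbation: step each feature by eps in the
    direction that increases the loss for the true label y = 1.
    For logistic loss, dL/dx = (p - 1) * w, so we follow its sign."""
    p = predict(x)
    return [xi + eps * math.copysign(1.0, (p - 1.0) * wi)
            for xi, wi in zip(x, w)]

x = [0.5, -0.3, 0.4]        # clean input: detector reports a lane
x_adv = fgsm(x, eps=0.5)    # small, sticker-like nudge per feature
print(predict(x))           # clearly above 0.5 on the clean input
print(predict(x_adv))       # drops below 0.5 after the perturbation
```

The point of the sketch is that the perturbation is computed from the model's own gradient, which is why physically small changes (like stickers) can be so effective against learned perception systems.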