
Researchers Uncover Potentially Fatal Flaw in Tesla Autopilot  

Security issues with connected cars are nothing new. Several years ago, researchers demonstrated that they could run a Jeep off the road after compromising the vehicle’s connected system. And last year, researchers uncovered a bug in an internet-connected vehicle management system that gave them access to its back-end systems, allowing them to locate and unlock cars, disable their alarms and start their engines. But, according to Ars Technica, researchers from Tencent’s Keen Security Lab recently shed light on a different type of security vulnerability in a Tesla Model S 75 – one with potentially fatal consequences.

Without hacking into the car’s computing system, they were able to prove that a Tesla can be tricked into automatically changing lanes and driving into oncoming traffic. How? The researchers subtly altered the vehicle’s driving environment, placing small, inconspicuous stickers across the roadway so that the Tesla’s Enhanced Autopilot feature would read them as a lane marking and follow the fake lane into the oncoming-traffic lane.

In the full report, the researchers explained:

Tesla autopilot module’s lane recognition function has a good robustness in an ordinary external environment (no strong light, rain, snow, sand and dust interference), but it still doesn’t handle the situation correctly in our test scenario. This kind of attack is simple to deploy, and the materials are easy to obtain. As we talked in the previous introduction of Tesla’s lane recognition function, Tesla uses a pure computer vision solution for lane recognition, and we found in this attack experiment that the vehicle driving decision is only based on computer vision lane recognition results. Our experiments proved that this architecture has security risks and reverse lane recognition is one of the necessary functions for autonomous driving in non-closed roads. In the scene we build, if the vehicle knows that the fake lane is pointing to the reverse lane, it should ignore this fake lane and then it could avoid a traffic accident.
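To make the researchers’ point about a vision-only architecture more concrete, here is a minimal, hypothetical sketch of a classical lane detector built with OpenCV. This is not Tesla’s implementation – Autopilot uses a far more sophisticated neural-network-based pipeline – but it illustrates the underlying issue: any system that infers lane geometry purely from image features will also respond to features an attacker places on the pavement, such as a short row of stickers.

```python
# Minimal, hypothetical sketch of a classical computer-vision lane detector.
# Not Tesla's pipeline; it only illustrates why a vision-only system can be
# misled by markings an attacker adds to the road surface.

import cv2
import numpy as np


def detect_lane_lines(frame: np.ndarray) -> list:
    """Return line segments (x1, y1, x2, y2) that look like lane markings."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    edges = cv2.Canny(blurred, 50, 150)

    # Restrict attention to a trapezoidal region of interest ahead of the car.
    h, w = edges.shape
    mask = np.zeros_like(edges)
    roi = np.array([[(0, h), (w // 2 - 50, h // 2), (w // 2 + 50, h // 2), (w, h)]])
    cv2.fillPoly(mask, roi, 255)
    edges = cv2.bitwise_and(edges, mask)

    # Probabilistic Hough transform: the generous maxLineGap means a few
    # small, high-contrast patches laid in a row (e.g. stickers) can be
    # bridged into a single "lane line", even though a human driver would
    # never read them as one.
    lines = cv2.HoughLinesP(edges, rho=2, theta=np.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=100)
    return [] if lines is None else [tuple(l[0]) for l in lines]


if __name__ == "__main__":
    # "road.jpg" is a placeholder dashcam-style image; any road photo will do.
    frame = cv2.imread("road.jpg")
    for x1, y1, x2, y2 in detect_lane_lines(frame):
        cv2.line(frame, (x1, y1), (x2, y2), (0, 0, 255), 3)
    cv2.imwrite("lanes_overlay.jpg", frame)
```

The sketch also mirrors the researchers’ architectural point: if the detected lane is the only input to the steering decision, there is no second sensor or map check to flag that the “lane” leads into opposing traffic.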

Although this type of attack isn’t the result of an issue with Tesla’s software, it does highlight one of the many potential security risks presented by connected vehicles. Elon Musk himself acknowledged the research as “solid work.”

You can read more about this vulnerability in the complete write-up in Ars Technica. Learn more about connected car hacks and stay up to date on the latest security news and best practices here on Secplicity.
