Security issues with connected cars are nothing new. Several years ago, researchers demonstrated that they could run a Jeep off the road after compromising the vehicle’s connected system. And last year, researchers uncovered a bug that gave them access to the back-end systems of an internet-connected vehicle management system, which could allow them to locate and unlock cars, disable their alarms and start the engines. But, according to Ars Technica, researchers from Tencent’s Keen Security Lab recently shed light on a different type of security vulnerability in a Tesla Model S 75 – one with potentially fatal consequences.
Without hacking into the car’s computing system, they were able to prove that a Tesla can be tricked into automatically changing lanes and driving into oncoming traffic. How? The researchers subtly altered the vehicle’s driving environment, placing small, inconspicuous stickers on the roadway so that the Tesla’s Enhanced Autopilot feature would recognize them as lane markings and steer to follow the fake lane they traced.
In the full report, the researchers explained:
“Tesla autopilot module’s lane recognition function has a good robustness in an ordinary external environment (no strong light, rain, snow, sand and dust interference), but it still doesn’t handle the situation correctly in our test scenario. This kind of attack is simple to deploy, and the materials are easy to obtain. As we talked in the previous introduction of Tesla’s lane recognition function, Tesla uses a pure computer vision solution for lane recognition, and we found in this attack experiment that the vehicle driving decision is only based on computer vision lane recognition results. Our experiments proved that this architecture has security risks and reverse lane recognition is one of the necessary functions for autonomous driving in non-closed roads. In the scene we build, if the vehicle knows that the fake lane is pointing to the reverse lane, it should ignore this fake lane and then it could avoid a traffic accident.”
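To make the failure mode concrete, here is a minimal, hypothetical sketch in Python with OpenCV. It is not Tesla’s actual pipeline, which Keen Lab did not publish; it uses a generic Canny-plus-Hough lane detector on a synthetic road image, with a made-up centerline position, purely to show why a detector tuned to bridge the gaps in dashed lane markings will also chain a few stickers into one confident fake lane – and where the plausibility check the researchers call for could intervene.

import cv2
import numpy as np

# Hypothetical sketch only -- Keen Lab did not publish Tesla's vision stack.
# It illustrates the intuition behind the attack: a classical lane detector
# (Canny edges + probabilistic Hough transform) is built to bridge the gaps
# in dashed lane markings, so a sparse trail of stickers on the asphalt can
# be merged into one confident "lane line" pointing the wrong way.

# Synthetic bird's-eye "road": dark asphalt with three small bright patches
# laid along a diagonal that drifts toward the opposing lane.
road = np.full((400, 600), 40, dtype=np.uint8)
for t in (0.1, 0.45, 0.8):  # three short dashes with long gaps between them
    x, y = int(280 + 120 * t), int(380 - 300 * t)
    cv2.line(road, (x, y), (x + 12, y - 30), 255, thickness=4)

edges = cv2.Canny(road, 50, 150)

# maxLineGap lets the detector connect edge pixels across the bare asphalt
# between stickers, exactly as it must for ordinary dashed markings.
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=25,
                        minLineLength=25, maxLineGap=150)

CENTERLINE_X = 300  # assumed boundary with the reverse lane (made up here)
for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
    print(f"detected lane segment ({x1},{y1}) -> ({x2},{y2})")
    # The researchers' suggested mitigation, paraphrased: a recognized lane
    # that steers across the centerline into oncoming traffic is implausible
    # and should be ignored rather than followed.
    if max(x1, x2) > CENTERLINE_X:
        print("  -> crosses the assumed centerline; a plausibility check "
              "would reject this lane instead of steering along it")

Production systems use neural-network lane detectors rather than a Hough transform, but the researchers’ point stands either way: if driving decisions rest solely on whatever lane the vision stack reports, a physically cheap perturbation of the road surface becomes a steering input.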
Although this type of attack isn’t the result of an issue with Tesla’s software, it does highlight one of the many potential security risks presented by connected vehicles. Elon Musk himself acknowledged the research as “solid work.”
You can read more about this vulnerability in the complete write-up on Ars Technica. Learn more about connected car hacks and stay up to date on the latest security news and best practices here on Secplicity.
Kristin says
But the potentially fatal flaw did not qualify for Tesla’s bug bounty program, and according to Tesla’s statement it was not considered a real-world scenario, since the driver can easily take back control of the car, assuming they are paying attention (which they always should be). The primary vulnerability in Tesla’s Autopilot system, which would allow remote control of the car, was already patched in 2017-2018. I’m not a fan of self-driving anything, but I’m sure the Wright Brothers had issues to deal with before their first flight. If Tesla has to wait for things to be perfect, they’ll never “get off the ground”. It will be interesting to see this unfold.
Dave Purscell says
Interesting article. Curious how the reverse lane recognition would function during Minnesota’s other season… Road Construction. Crossovers happen frequently (and totally confuse Google Maps when they do). Crossovers sometimes happen with little or no advance notice. Contraflow is another tool that can be used in disasters (such as hurricane evacuations) and to manage congestion.