Google Duplex is a proof-of-concept extension of the Google Assistant technology found on many Google and Android devices that, in a recent demo, almost perfectly imitated a human voice over the phone. While this technology is exciting, having an AI engine that can trick people over the phone into thinking that it’s a bona fide human has some troubling privacy and security implications. WatchGuard CTO Corey Nachreiner breaks down some of the security and privacy risks of Google Duplex in his latest column for GeekWire.
Corey’s biggest concern is that an AI voice capable of passing the Turing test could also be used to make automated phishing phone calls. Called “voice phishing” or “vishing” in the information security community, this tactic involves tricking a person over the phone into giving up information they shouldn’t share (like a PIN or login credentials). These attacks haven’t scaled well, since (until now) a hacker had to make each call personally. But an AI like Google Duplex that can convincingly imitate a human could carry out these attacks on a massive scale.
Furthermore, this technology theoretically allows an attacker to imitate someone’s voice, so that fake vishing calls could seem to come from a friend or boss. Here’s an excerpt from Corey’s article discussing that frightening possibility.
AI and machine learning are also allowing researchers to mimic our voices and images as well. In fact, last year a company called Lyrebird unveiled a voice imitation algorithm that could mimic a real person simply based on having a small audio snippet of their voice…Spear-phishing has become a huge threat. Often, sophisticated hackers learn a bit about you, and use that knowledge to design emails that you’re more likely to respond to. One example is crafting an email that appears to come from your boss. Now imagine a call that comes from your boss. It sounds like your boss, talks with his or her cadence, but is actually an AI assistant using a voice imitation algorithm. Such a call offers limitless malicious potential to bad actors.
Read the full article on GeekWire to learn what other elements of Google Duplex give Corey pause (and the safeguards Google has in place to prevent abuse), and read more about phishing and spear phishing attacks on Secplicity.
So the question quickly becomes: “Where is the line in the sand?” The moment the Duplex assistant forms an opinion or testifies against you, it’s too late. Privacy is only as good as the dialogues you believe are held in confidence. Oh, wait; did I just say that privately? Did someone just hear what I said?
Yep, that is what I thought…