AI and deepfake tech making scammers more dangerous, experts say

While deepfake technology can have its goofy uses — like transposing your own face onto a star’s for a viral video — cybersecurity experts say criminals are increasingly using it to scam people. 

Bloomberg reports such tech will “turbocharge the cybertheft economy,” with fraudsters using everything from easily available software that can make them sound like your own kids to 3D-printed masks of a victim’s face designed to trick facial recognition software. 

As Mission: Impossible as that sounds, the technology is here now, and banking and cybersecurity experts are scrambling to keep up. 

Rob Pope, director of New Zealand’s government cybersecurity agency CERT NZ, tells Bloomberg, “What AI does is accelerate the levels of sophistication and the ability of these bad people to pivot very quickly. AI makes it easier for them.” 

He adds, “It’s a fair bet that over the next two or three years we’re going to see more AI-generated criminal attacks.”

Making things easier for the crooks? Social media. Scammers can, for example, grab a short clip of a loved one’s voice from a post and quickly clone it with off-the-shelf software. From there, the scammer can pose as that loved one, reaching out to you or your social media contacts to ask for money.

Selfies and other photos can be exploited, too, with crooks “lifting” the face from a picture and mapping it onto a new avatar — or even a physical 3D lookalike — for transactions requiring photo proof of identity.

Banks and security companies are also using AI to sniff out fraudulent transactions, but it’s an uphill battle, as the bad guys are quick to adapt to changing technology.