‘Sophisticated’ AI Voice Cloning Scam on Rise, Survey Shows

Conning people out of money is a behavior as old as the human race itself, but modern computer technology is making it easier, and scarier, than ever.

The UK’s Starling Bank recently surveyed more than 3,000 people to gauge how often the public is targeted by scammers using artificial intelligence (AI). The survey found that 25 percent of respondents say they have been targeted by a “voice cloning” scam at least once in the past year.

Voice cloning is the process of feeding samples of a known person’s voice into an AI program, which then recreates the voice so closely that most people would never know the difference. The technology has gotten so good that it can produce video of well-known people that is virtually indistinguishable from the real thing.

Actor Tom Hanks found that out the hard way last year when he discovered a video apparently of him hawking a dental plan that he had never heard of. Just this week, California Governor Gavin Newsom signed a new law making it illegal to produce “deep fake” AI-generated political ads.

Back in the UK, the average adult is targeted by some kind of financial scam at least five times a year. That number may rise as deep fake voice technology spreads. All a con artist needs is a small sample of a real person’s voice to make that “person” recite words they never actually spoke.

And a sample can come from almost anywhere: a voicemail greeting, or any video or audio recording of a person’s voice that ends up on the internet. Lisa Grahame, Starling Bank’s chief information security officer, said scammers “only need three seconds” of audio to do it.

Once the criminals can generate the fake voice, they target that person’s friends and family. A relative then gets a phone call in which what sounds like a daughter or an aunt, for example, claims to be in dire straits and in urgent need of money. Popular cover stories include having been arrested and needing bail money, or being stranded at a vacation destination after having a wallet stolen.

The Starling Bank survey also found that about 10 percent of respondents admitted they would do whatever the fake voice asked, even if they initially questioned whether it was truly their loved one on the other end of the line.

Most people don’t know how to detect such scams, and doing so is not always possible. Experts advise listening for any odd pauses, or any phrases that sound unnatural. Those who want to be extra careful can agree on a “password” that only their family knows. If a “family member” calls with a financial emergency, ask for the password. It’s an easy, low-tech way to make sure you’re talking to your daughter Jill, and not a disembodied robot.