1 in 4 People Impacted by Artificial Intelligence Voice Scams

A recent study by computer security company McAfee reported that one in four people have been impacted by artificial intelligence voice scams. The report, The Artificial Imposter, provides insight into how AI is fueling a disturbing trend in cybercrime: the rise of online voice scams.

According to the report, which surveyed 7,054 people across seven countries, 77% of victims of AI-enabled scam calls said they lost money. The study found that a quarter of adults had previously experienced some kind of AI voice scam, with 1 in 10 targeted personally and 15% saying it happened to someone they know.

These AI-enabled voice scams feed on fear, creating a sense of urgency and convincing people that a friend or family member is in distress and needs money to get out of legal trouble or some other pressing crisis. AI allows scammers to take a short audio snippet of someone's voice found online and replicate it into an authentic-sounding clip.

“Artificial intelligence brings incredible opportunities, but with any technology there is always the potential for it to be used maliciously in the wrong hands. This is what we’re seeing today with the access and ease of use of AI tools helping cybercriminals to scale their efforts in increasingly convincing ways,” said Steve Grobman, McAfee CTO.

Criminals upload the clip to an AI voice-cloning program that replicates the voice, and the technology that makes this possible is now extremely inexpensive: some voice-replicating services cost as little as $15 a month.

Everybody’s voice is unique, the spoken equivalent of a biometric fingerprint, which is why hearing someone speak (or, in these instances, thinking we are hearing someone speak) is such a widely accepted way of establishing trust. But with 53% of adults sharing their voice data online at least once a week (via YouTube videos, social media posts, podcasts, etc.) and 49% doing so up to 10 times a week, cloning how somebody sounds has become another tool for committing cybercrime. Programs readily available online harness AI to clone a voice from as little as three minutes of uploaded audio.

McAfee researchers spent three weeks investigating the accessibility, ease of use, and efficacy of AI voice-cloning tools, finding more than a dozen freely available on the internet, many of which require only a basic level of experience and expertise to use. In one instance, just three seconds of audio was enough to produce an 85% voice match, and with more time and effort it's possible to achieve even greater accuracy.
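To make the idea of a "match" score concrete: speaker-verification systems typically reduce a voice sample to a numeric embedding vector and compare two embeddings with a similarity measure such as cosine similarity. The Python sketch below is a minimal, hypothetical illustration of that comparison only; the random vectors stand in for embeddings a real speaker-verification model would produce, and the 0.85 threshold is an assumption chosen for illustration, not a value from the McAfee report.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two speaker embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 256-dimensional speaker embeddings. In a real system these
# would come from a speaker-verification model applied to audio clips.
rng = np.random.default_rng(seed=0)
genuine_voice = rng.standard_normal(256)
cloned_voice = genuine_voice + 0.35 * rng.standard_normal(256)  # a close imitation

score = cosine_similarity(genuine_voice, cloned_voice)
MATCH_THRESHOLD = 0.85  # assumed acceptance threshold, for illustration only

print(f"similarity: {score:.2f}")
if score >= MATCH_THRESHOLD:
    print("Voices judged a match: a convincing clone could pass this check.")

The point of the sketch is that a clone does not need to be perfect; it only needs to score above whatever threshold a listener (or a verification system) implicitly applies.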

How can people protect themselves against the uptick in AI voice-cloning scams? According to the FTC, if you receive a call like this, the first step is to hang up and call the person who supposedly contacted you, using a phone number you already know is theirs, and verify the story. A pre-determined codeword, agreed on in advance for use during calls of distress, can help confirm that such a call is legitimate. If you can't get in touch with the person, watch out for requests to send money in atypical ways, such as wire transfers or cryptocurrency.

Technology can be harnessed in many positive ways, but unfortunately it can also be used as a tool by criminals. Always question the source of a strange phone call, think before you click or share information online, and protect your personal data with tools such as Enfortra's PII removal product.
