How voice cloning through artificial intelligence is being used for scams | Explained
AI voice cloning scams and disinformation have become rampant, with India topping the list of victims. The U.S. FTC has launched a challenge to detect and monitor cloned voices, while the global market for these applications is estimated to reach $5 billion by 2032.
A few years ago, voice cloning through Artificial Intelligence (AI) was just a phenomenon of mild amusement. AI-generated songs by famous artistes like Drake and Ariana Grande were floating around online. However, fears around the technology were realised when AI voice cloning-related scams burgeoned. In April last year, a family in Arizona, U.S., was asked to pay a ransom in a fake kidnapping scheme pulled off with an AI-cloned voice. And scams weren’t the end of it: easy access to AI voice clones also spawned disinformation.
In January last year, 4chan users flocked to free AI voice cloning tools to generate celebrity hate speech, producing clips in which Harry Potter actress Emma Watson read out a portion of Mein Kampf and conservative political pundit Ben Shapiro made racist comments about Democrat politician Alexandria Ocasio-Cortez.
Similar incidents have surfaced in India. A report titled ‘The Artificial Imposter’, published in May last year, revealed that 47% of surveyed Indians had either been a victim of an AI-generated voice scam or knew someone who had. That is almost twice the global average of 25%; in fact, India topped the list with the maximum number of victims of AI voice scams. While several cases went unreported, some came to light. In December, a Lucknow resident fell prey to a cyberattack in which AI was used to impersonate the voice of the victim’s relative and request a substantial transfer through UPI. Another report in August stated that a man from Haryana was duped of ₹30,000 by a scamster who used an AI app to sound like the victim’s friend in dire need of money after an accident.
Indians have been found to be particularly vulnerable to scams of this nature. According to McAfee, 66% of Indian participants admitted they would respond to a voice or phone call that appeared to be from a friend or family member in urgent need of money, especially if the caller was supposedly a parent (46%), spouse (34%) or their child (12%). The report stated that messages claiming the sender had been robbed (70%), been in a car accident (69%), lost their phone or wallet (65%) or needed financial aid while travelling abroad (62%) were the most effective excuses.
While these tools aren’t perfect, scammers rely on creating a sense of urgency to paper over their flaws. The report also found that 86% of Indian respondents shared their voice data online or via voice notes at least once a week, which makes these tools all the more potent.
Once a scammer finds an audio clip of an individual’s voice, all it takes is uploading the clip to an online program that can replicate the voice accurately, barring some intonations. A host of these applications exist online, with popular ones including Murf, Resemble and Speechify. While most of these providers charge a monthly subscription fee, ranging from under $15 for basic plans to $100 for premium options, they also offer free trial periods.
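Under the hood, most of these services expose a simple two-step workflow: upload a short reference recording of the target voice, then request synthesis of arbitrary text in the cloned voice. The sketch below illustrates that general pattern in Python; the endpoint, field names and response format are hypothetical stand-ins for illustration only, not the API of any provider named above.

```python
# Hypothetical illustration only: the base URL, routes, request fields and
# response fields below are invented for this sketch and do not correspond
# to any specific voice cloning provider.
import requests

API_BASE = "https://api.example-voice-service.com/v1"  # hypothetical URL
API_KEY = "YOUR_API_KEY"  # typically issued on sign-up

# Step 1: upload a short reference clip of the target voice.
with open("reference_clip.mp3", "rb") as clip:
    resp = requests.post(
        f"{API_BASE}/voices",
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"audio": clip},
        data={"name": "cloned-voice-demo"},
    )
resp.raise_for_status()
voice_id = resp.json()["voice_id"]  # assumed response field

# Step 2: synthesise arbitrary text in the cloned voice.
resp = requests.post(
    f"{API_BASE}/text-to-speech/{voice_id}",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"text": "Hi, it's me. I'm in trouble and need money urgently."},
)
resp.raise_for_status()

# Save the returned audio, playable in the cloned voice.
with open("cloned_output.mp3", "wb") as out:
    out.write(resp.content)
```

That the whole pipeline reduces to two web requests around a short reference clip is precisely what makes these tools accessible to scammers with no technical expertise.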
One especially lauded provider is ElevenLabs, a year-old AI startup founded by former Google and Palantir employees. The Andreessen Horowitz-backed firm has released a steady stream of tools. In October last year, it launched a product called AI Dubbing, which can translate even long-form speech into 20 different languages.