
How to Stay Safe from AI Voice Scams

Scammers are using voice-enabled AI models to automate fraudulent schemes, letting them target more people faster and at lower cost. A few seconds of audio is enough to clone a loved one’s voice, so trust your ear and verify callers before acting. Digital lender Starling Bank has launched a Safe Phrases campaign encouraging friends and family to agree on a phrase they can use to verify their identities.

The next spam call you receive may be indistinguishable from a real person. Scammers are using voice-enabled AI models to automate their fraudulent schemes, deceiving victims by mimicking real people, including family members.

AI Voice Scams: What Are They?

While scam calls themselves are not new, AI-powered scams are a newly dangerous breed: using generative AI, they mimic friends and family members as well as celebrities and authorities.

The introduction of AI models trained on human voices has opened up a new front in phone fraud. Tools such as OpenAI’s speech API enable real-time conversation between a human and an AI model, and with a little coding these models can be set up to run phone scams automatically, coaxing victims into divulging private information.


So, how can you protect yourself? The threat is so dangerous not only because the tools are cheap and easy to use, but also because AI voices have become strikingly realistic.

Sir David Attenborough has described himself as “profoundly disturbed” by an AI voice clone that was indistinguishable from his own speech, while OpenAI drew criticism earlier this year for its Sky voice, which sounded uncannily similar to Scarlett Johansson.

Even tools built to defeat scammers show how blurred the lines have become. UK network O2 recently introduced Daisy, an AI “grandmother” designed to waste phone scammers’ time by keeping them in a conversation they believe they are having with a real senior citizen. Beyond being a creative use of the technology, it demonstrates how convincingly AI can mimic human interaction.

Worryingly, scammers need only tiny audio samples to train AI voices. Cybersecurity company F-Secure says that a few seconds of audio, easily lifted from a social media video, is enough to replicate a loved one’s voice.


How AI Voice Cloning Scams Work

The fundamental idea behind voice-clone schemes is the same as that of traditional phone scams: hackers pose as someone to win the victim’s trust, then instill a sense of urgency that persuades them to give the fraudster money or sensitive information.

Voice-clone scams differ in two ways. First, by automating the process with software, the crooks can target more people, faster and at lower cost. Second, they can mimic people you know personally, not just celebrities and authorities.

All that is needed is an audio sample, typically taken from an online video. The AI model analyzes and mimics it, and the cloned voice can then be used in fraudulent calls. One increasingly popular tactic is to impersonate a family member asking for money in an emergency.

The technology can also be used to mimic the voices of well-known people in order to manipulate victims. Recently, scammers attempted an investment scam using an AI voice clone of Queensland Premier Steven Miles.


Ways to Prevent AI Voice Scams

Digital lender Starling Bank reports that while 28% of people in the UK say they have been targeted by an AI voice-clone scam, only 30% are confident they could spot one. That is why Starling launched its Safe Phrases campaign, a simple strategy that encourages friends and family to agree on a secret phrase they can use to verify each other’s identity.

If you’re ever unsure about a caller’s identity, you can use a similar strategy even without a pre-agreed safe phrase. AI voice clones can mimic a person’s speech patterns, but they are unlikely to know that person’s private details. You can get closer to certainty by asking the caller to confirm something only the real person would know, such as details from your most recent conversation.

You should also trust your ear. AI voice clones are strikingly realistic, but they are not perfect. Listen for warning signs such as slurring, flat or emotionless delivery, and uneven emphasis on particular words.

Scammers can also spoof caller ID, so a call may even appear to come from your friend’s number. If you’re ever in doubt, the safest course of action is to end the call and ring the person back on the number you normally use for them.


Voice-clone frauds use the same pressure tactics as traditional phone scams: they are designed to create emotional urgency and push you into acting when you normally wouldn’t. Watch out for these tactics and be wary of unusual requests, particularly any that involve sending money.

Callers claiming to represent your bank or another authority should raise the same suspicions. It also pays to know how your bank actually contacts you. Starling customers, for instance, can check the call status indicator in the bank’s app at any time to see whether the bank is really calling.


