AI voice scams are on the rise – here’s how to stay safe, according to security experts



  • AI voice-clone scams are on the rise, according to security experts
  • Voice-enabled AI models can be used to imitate loved ones
  • Experts recommend agreeing a safe phrase with friends and family

The next spam call you receive might not be a real person – and your ear won’t be able to tell the difference. Scammers are using voice-enabled AI models to automate their fraudulent schemes, tricking individuals by imitating real human callers, including family members.

What are AI voice scams?

Scam calls aren’t new, but AI-powered ones are a dangerous new breed. They use generative AI to imitate not just authorities or celebrities, but friends and family.

The arrival of AI models trained on human voices has unlocked a new realm of risk when it comes to phone scams. These tools, such as OpenAI’s voice API, support real-time conversation between a human and the AI model. With a small amount of code, these models can be programmed to execute phone scams automatically, encouraging victims to disclose sensitive information.

So how can you stay safe? What makes the threat so problematic is not just how easily and cheaply it can be deployed, but how convincing AI voices have become.

OpenAI faced backlash for its Sky voice option earlier this year, which sounded spookily like Scarlett Johansson, while Sir David Attenborough has reportedly described himself as “profoundly disturbed” by an AI voice clone which was indistinguishable from his real speech.


Even tools designed to beat scammers demonstrate how blurred the lines have become. UK network O2 recently launched Daisy, an AI grandma designed to trap phone scammers in a time-wasting conversation that they believe is with a real senior citizen. It’s a clever use of the technology, but also one that shows just how well AI can simulate human interactions.
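To make the mechanics concrete, the sketch below shows the rough shape of a Daisy-style scam-baiting agent: a transcribe–reply–speak loop wrapped around a persona prompt. This is a minimal illustration, not O2’s actual implementation – the persona text and all three service calls are hypothetical stand-ins rather than any real vendor’s API.

```python
# Hedged sketch of a Daisy-style scam-baiting voice agent.
# The three service calls below are hypothetical stand-ins,
# not O2's implementation or any specific vendor's API.

PERSONA = (
    "You are Daisy, a chatty grandmother. Reply slowly, wander off "
    "on tangents, and never reveal real personal or financial details."
)

def transcribe(audio: bytes) -> str:
    """Stand-in for a speech-to-text call."""
    raise NotImplementedError

def generate_reply(history: list[dict]) -> str:
    """Stand-in for a chat-completion call driven by the persona prompt."""
    raise NotImplementedError

def synthesize(text: str) -> bytes:
    """Stand-in for text-to-speech in a convincingly elderly voice."""
    raise NotImplementedError

def handle_call(next_audio_chunk, play_audio):
    """Keep the scammer talking for as long as possible."""
    history = [{"role": "system", "content": PERSONA}]
    while (audio := next_audio_chunk()) is not None:
        history.append({"role": "user", "content": transcribe(audio)})
        reply = generate_reply(history)
        history.append({"role": "assistant", "content": reply})
        play_audio(synthesize(reply))
```

Swap in a different persona prompt and the very same loop becomes the automated scam caller described above – which is exactly why the defensive advice that follows matters.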

Disturbingly, fraudsters can train AI voices based on very small audio samples. According to F-Secure, a cybersecurity firm, just a few seconds of audio is enough to simulate the voice of a loved one. This could easily be sourced from a video shared on social media.

How AI voice-cloning scams work

The basic concept of a voice-clone scam is similar to standard phone scams: cybercriminals impersonate someone to gain the victim’s trust, then create a sense of urgency which encourages them to disclose sensitive information or transfer money to the fraudster.


The difference with voice-clone scams is twofold. Firstly, fraudsters can automate the process with code, allowing them to target more people, more quickly and for less money. Secondly, they are able to imitate not just authorities and celebrities, but people known directly to you.


All that’s required is an audio sample, which is usually taken from a video online. This is then analyzed by the AI model and imitated, allowing it to be used in deceptive interactions. One increasingly common technique is for the AI model to imitate a family member requesting money in an emergency.

The technology can also be used to simulate the voices of high-profile individuals to manipulate victims. Scammers recently used an AI voice clone of a public figure to try and promote an investment scam.

How to stay safe from AI voice scams

According to Starling Bank, a digital lender, 28% of UK adults say they have been targeted by AI voice-clone scams, yet only 30% are confident that they’d know how to recognize one. That’s why Starling launched its Safe Phrases campaign, which encourages friends and family to agree a secret phrase which they can use to confirm each other’s identity – and that’s a wise tactic.

TL;DR How to stay safe


1. Agree a safe phrase with friends and family
2. Ask the caller to confirm some recent private information
3. Listen for uneven stresses on words or emotionless talk
4. Hang up and call the person back
5. Be wary of unusual requests, like requests for bank details

Even without a pre-agreed safe phrase, you can use a similar tactic if you’re ever in doubt as to the veracity of a caller’s identity. AI voice clones can imitate a person’s speech pattern, but they won’t necessarily have access to private information. Asking the caller to confirm something that only they would know, such as information shared in the last conversation you had, is one step closer to certainty.
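The safe phrase works because it is a pre-shared secret: a clone can copy a voice, but it cannot know something that was only ever agreed in person. As a toy illustration (the phrase and function names here are invented, not part of Starling’s campaign), a careful check would store only a hash of the phrase and compare in constant time:

```python
import hashlib
import hmac

def phrase_fingerprint(phrase: str) -> bytes:
    """Normalize, then hash - store the hash, never the phrase itself."""
    return hashlib.sha256(phrase.strip().lower().encode()).digest()

# Agreed face to face beforehand; only the fingerprint is kept.
STORED = phrase_fingerprint("purple elephants at teatime")

def caller_knows_phrase(spoken: str) -> bool:
    """Constant-time comparison, so a wrong guess leaks nothing."""
    return hmac.compare_digest(phrase_fingerprint(spoken), STORED)

assert caller_knows_phrase("Purple elephants at teatime")
assert not caller_knows_phrase("fluffy kittens at noon")
```

In practice the check happens in your head mid-call, but the design point carries over: pick a phrase that has never been posted online, and treat it like a password.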

Trust your ear as well. While AI voice clones are very convincing, they aren’t 100% accurate. Listen for tell-tale signs such as uneven stresses on certain words, emotionless expression or slurring.

Scammers have the ability to mask the number they’re calling from and may even appear to be calling from your friend’s number. If you’re ever in doubt, the safest thing you can do is hang up and call the person back on the usual number you have for them.

Voice-clone scams also rely on the same tactics as traditional phone scams. These tactics aim to apply emotional pressure and create a sense of urgency, forcing you into taking an action you otherwise wouldn’t. Be alert to these and be wary of unusual requests, especially those that involve transferring money.

The same red flags apply to callers claiming to be from your bank or another authority. It pays to be familiar with the procedures your bank uses when contacting you. Starling, for example, has a call-status feature in its app, which you can check at any time to see if the bank is genuinely calling you.
