Pelican Press, posted Sunday at 12:06 PM

AI voice scams are on the rise – here’s how to stay safe, according to security experts

AI voice-clone scams are on the rise, according to security experts
Voice-enabled AI models can be used to imitate loved ones
Experts recommend agreeing a safe phrase with friends and family

The next spam call you receive might not be a real person – and your ear won’t be able to tell the difference. Scammers are using voice-enabled AI models to automate their fraudulent schemes, tricking individuals by imitating real human callers, including family members.

What are AI voice scams?

Scam calls aren’t new, but AI-powered ones are a dangerous new breed. They use generative AI to imitate not just authorities or celebrities, but friends and family.

The arrival of AI models trained on human voices has unlocked a new realm of risk when it comes to phone scams. These tools, such as OpenAI’s voice API, support real-time conversation between a human and the AI model. With a small amount of code, these models can be programmed to conduct phone scams automatically, encouraging victims to disclose sensitive information.

So how can you stay safe? What makes the threat so problematic is not just how easily and cheaply it can be deployed, but how convincing AI voices have become. OpenAI faced backlash for its Sky voice option earlier this year, which sounded spookily like Scarlett Johansson, while Sir David Attenborough described being imitated by an AI voice clone, which was indistinguishable from his real speech, as "profoundly disturbing".

(Image credit: Getty Images / d3sign)

Even tools designed to beat scammers demonstrate how blurred the lines have become.
UK network O2 recently launched Daisy, an AI grandma designed to trap phone scammers in a time-wasting conversation which they believe is with a real senior citizen. It’s a clever use of the technology, but also one that shows just how well AI can simulate human interactions.

Disturbingly, fraudsters can train AI voices on very small audio samples. According to F-Secure, a cybersecurity firm, just a few seconds of audio is enough to simulate the voice of a loved one. This could easily be sourced from a video shared on social media.

How AI voice-cloning scams work

The basic concept of a voice-clone scam is similar to that of a standard phone scam: cybercriminals impersonate someone to gain the victim’s trust, then create a sense of urgency which encourages them to disclose sensitive information or transfer money to the fraudster.

The difference with voice-clone scams is two-fold. Firstly, scammers can automate the process with code, allowing them to target more people, more quickly and for less money. Secondly, they are able to imitate not just authorities and celebrities, but people known directly to you.

All that’s required is an audio sample, which is usually taken from a video online. This is then analyzed by the AI model and imitated, allowing it to be used in deceptive interactions. One increasingly common technique is for the AI model to imitate a family member requesting money in an emergency. The technology can also be used to simulate the voices of high-profile individuals to manipulate victims. Scammers recently used an AI voice clone of a high-profile figure to try and promote an investment scam.
How to stay safe from AI voice scams

According to Starling, a digital lender, 28% of UK adults say they have been targeted by AI voice-clone scams, yet only 30% are confident that they’d know how to recognize one. That’s why Starling launched its Safe Phrases campaign, which encourages friends and family to agree a secret phrase they can use to confirm each other’s identity – and that’s a wise tactic.

TL;DR: How to stay safe

(Image credit: Getty Images / Ronstick)

1. Agree a safe phrase with friends and family
2. Ask the caller to confirm some recent private information
3. Listen for uneven stresses on words or emotionless talk
4. Hang up and call the person back
5. Be wary of unusual requests, like requests for bank details

Even without a pre-agreed safe phrase, you can use a similar tactic if you’re ever in doubt as to the veracity of a caller’s identity. AI voice clones can imitate a person’s speech pattern, but they won’t necessarily have access to private information. Asking the caller to confirm something that only they would know, such as information shared in the last conversation you had, is one step closer to certainty.

Trust your ear as well. While AI voice clones are very convincing, they aren’t 100% accurate. Listen for tell-tale signs such as uneven stresses on certain words, emotionless expression or slurring.

Scammers have the ability to mask the number they’re calling from and may even appear to be calling from your friend’s number. If you’re ever in doubt, the safest thing you can do is hang up and call the person back on the usual number you have for them.

Voice-clone scams also rely on the same tactics as traditional phone scams. These tactics aim to apply emotional pressure and create a sense of urgency, forcing you into taking an action you otherwise wouldn’t. Be alert to these, and be wary of unusual requests, especially when they relate to making a money transfer.
The same red flags apply to callers claiming to be from your bank or another authority. It pays to be familiar with the procedures your bank uses when contacting you. Starling, for example, has a call status indicator in its app, which you can check at any time to see if the bank is genuinely calling you.