
Deepfake scams have looted millions. Experts warn it could get worse



[Image: 3D generated face representing artificial intelligence technology. Themotioncloud | iStock | Getty Images]

A growing wave of deepfake scams has looted millions of dollars from companies worldwide, and cybersecurity experts warn it could get worse as criminals exploit generative AI for fraud.

A deepfake is a video, sound, or image of a real person that has been digitally altered and manipulated, often through artificial intelligence, to convincingly misrepresent them.

In one of the largest known cases this year, a Hong Kong finance worker was duped into transferring more than $25 million to fraudsters who used deepfake technology to disguise themselves as his colleagues on a video call.

Last week, British engineering firm Arup confirmed to CNBC that it was the company involved in that case, but said it could not go into detail on the matter due to the ongoing investigation.

Such threats have been growing as a result of the popularization of OpenAI’s ChatGPT, launched in 2022, which quickly shot generative AI technology into the mainstream, said David Fairman, chief information and security officer at cybersecurity company Netskope.

“The public accessibility of these services has lowered the barrier of entry for cyber criminals — they no longer need to have special technological skill sets,” Fairman said.

The volume and sophistication of the scams have expanded as AI technology continues to evolve, he added.

Rising trend 

Various generative AI services can be used to generate human-like text, image and video content, and thus can act as powerful tools for illicit actors trying to digitally manipulate and recreate certain individuals. 

A spokesperson from Arup told CNBC: “Like many other businesses around the globe, our operations are subject to regular attacks, including invoice fraud, phishing scams, WhatsApp voice spoofing, and deepfakes.”

The finance worker had reportedly attended the video call with people believed to be the company’s chief financial officer and other staff members, who requested he make a money transfer. However, the rest of the attendees present in that meeting had, in reality, been digitally recreated deepfakes. 

Arup confirmed that “fake voices and images” were used in the incident, adding that “the number and sophistication of these attacks has been rising sharply in recent months.” 

A similar case was reported in Shanxi province this year, involving a female financial employee who was tricked into transferring 1.86 million yuan ($262,000) to a fraudster’s account after a video call with a deepfake of her boss.

Broader implications 

In addition to direct attacks, companies are increasingly worried about other ways deepfake photos, videos or speeches of their higher-ups could be used in malicious ways, cybersecurity experts say.

According to Jason Hogg, cybersecurity expert and executive-in-residence at Great Hill Partners, deepfakes of high-ranking company members can be used to spread fake news to manipulate stock prices, damage a company’s brand and sales, and spread other harmful disinformation.

“That’s just scratching the surface,” said Hogg, who formerly served as an FBI Special Agent. 

He highlighted that generative AI is able to create deepfakes based on a trove of digital information such as publicly available content hosted on social media and other media platforms. 

In 2022, Patrick Hillmann, chief communications officer at Binance, claimed in a blog post that scammers had made a deepfake of him based on previous news interviews and TV appearances, using it to trick customers and contacts into meetings.

Netskope’s Fairman said such risks had led some executives to begin wiping out or limiting their online presence out of fear that it could be used as ammunition by cybercriminals.

Deepfake technology has already become widespread outside the corporate world.

From fabricated images to manipulated videos, celebrities and other public figures have fallen victim to deepfake technology. Deepfakes of politicians have also been rampant.

Meanwhile, some scammers have used deepfakes of victims’ family members and friends in attempts to fool them out of money.

According to Hogg, the broader issues will accelerate and get worse for a period of time, as cybercrime prevention requires thoughtful analysis in order to develop systems, practices, and controls to defend against new technologies.

However, the cybersecurity experts told CNBC that firms can bolster their defenses against AI-powered threats through improved staff education, cybersecurity testing, and requiring code words and multiple layers of approval for all transactions, a step that could have prevented cases such as Arup’s.
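To make that last point concrete, here is a minimal, hypothetical sketch in Python of how a code word check plus multiple independent approvals could gate a large payment. It is not drawn from Arup or any real system; the threshold, code word, and names are invented for illustration only.

```python
# Hypothetical sketch, not from the article: one way "code words plus multiple
# layers of approval" could gate a large payment before funds are released.
from dataclasses import dataclass, field

# Illustrative policy values; a real firm would set its own.
LARGE_TRANSFER_THRESHOLD_USD = 10_000
REQUIRED_APPROVERS = 2
SHARED_CODE_WORD = "example-code-word"   # agreed offline, never shared on calls or chat


@dataclass
class TransferRequest:
    amount_usd: float
    beneficiary: str
    requested_by: str
    code_word_given: str                 # collected over a separate, trusted channel
    approvals: set = field(default_factory=set)


def approve(request: TransferRequest, approver: str) -> None:
    """Record an approval; the requester cannot approve their own transfer."""
    if approver != request.requested_by:
        request.approvals.add(approver)


def may_execute(request: TransferRequest) -> bool:
    """Release funds only if the code word matches and enough distinct people signed off."""
    if request.code_word_given != SHARED_CODE_WORD:
        return False                     # fails even if the "CFO" on the video call looked real
    if request.amount_usd >= LARGE_TRANSFER_THRESHOLD_USD:
        return len(request.approvals) >= REQUIRED_APPROVERS
    return True


if __name__ == "__main__":
    req = TransferRequest(25_000_000, "Unknown Ltd", "finance_worker",
                          code_word_given="whatever the caller said")
    approve(req, "finance_worker")       # self-approval is ignored
    print(may_execute(req))              # False: wrong code word, no independent approvals
```

The point is simply that a convincing deepfaked face or voice on a call cannot, by itself, satisfy checks that live outside the call.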




