Dutch organisations vulnerable to deepfake fraud

Dutch companies are vulnerable to deepfake digital fraud due to the country’s high level of digitisation and working from home.

Ali Niknam, CEO of Dutch digital bank bunq, recently reported that an employee received a video call from him with an urgent request to transfer a hefty amount of money. “I must admit, I was genuinely surprised by how convincing my deepfake double was,” he wrote on LinkedIn. Fortunately, the employee did not fall for the deepfake, and the company was spared substantial financial damage. “Luckily, at bunq, we have processes in place that act as a failsafe in these situations,” he said. “Not only were we able to quickly detect the fraud, but within minutes, everyone at bunq was alerted to this latest scheme.”

Although using deepfakes for CEO fraud is not new, bunq is one of the first companies to go public about an incident.

“The [biggest] development of late is that there are now all kinds of tools and services offered on the dark web that make this technology applicable to a larger group of cyber criminals,” said De Geus, CEO of Orange Cyberdefense Netherlands. “Moreover, the technology is getting better and more realistic, making it harder to spot. Previously, if a video call was dubious, you could ask someone to remove their glasses, but now a deepfake can do that, too.”

The improvements in the technology, combined with AI that can work across languages, are causing this fraud to scale considerably, and De Geus said Dutch organisations should be concerned. “Cyber criminals look first to countries where they can gain the most, and in the Netherlands, we have a very high level of digitalisation,” he said. “The Netherlands is the European champion of part-time and working from home, so digital platforms like Zoom and Teams are hugely established here. Moreover, the Dutch are used to arranging many things online, which makes it promising for cyber criminals to get started on this form of fraud here.”

Although AI still sometimes stumbles in other languages, it is probably only a matter of time before significant strides are made in this area.

A professor of computer vision at the University of Amsterdam told news site NOS that it is now possible to clone very realistic voices in addition to deepfake videos. The professor has been researching deepfake technology, and how to recognise such manipulated images, for years.

“Detection tools are not enough,” he said. “We are always lagging behind now. Almost every week, a new tool comes along to generate something. That is a problem.”

De Geus also sees that detection tools still need to improve. “This is an area of considerable investment, because deepfake cyber fraud is a real threat,” he said. However, the threat is not equally relevant to every organisation. “Larger companies with large cash flows are attractive to cyber criminals, and so are organisations that work internationally, for example, because they are mostly used to digital meetings.”

Although, for the time being, cyber criminals seem to take a scattershot approach to deepfake fraud, much as they did with ransomware in its early days, De Geus also knows of an international company with a branch in Hong Kong where a financial employee was invited to a call with the entire board of directors.

“Several calls were organised, after which the financial employee transferred $25m in 15 transactions,” he said. “It shows criminals will go far to make targeted attacks look as real as possible.” This involves, for example, social engineering using information employees post on social networks, or data and images found on the organisation’s website.

Coping with the threat of deepfakes therefore requires more than just a detection tool: above all, awareness among employees and established verification procedures. Several security vendors in the Netherlands have noticed that the threat concerns their customers.

Orange Cyberdefense is also having conversations with customers about this. “We offer additional training courses focused on awareness and dealing with this new threat,” said De Geus. “We point out to organisations that they don’t suddenly need to become very anxious, but should, above all, keep using their common sense. If the CFO calls someone with an urgent request to transfer money, you can lay down simple procedures with extra checks outside the initial call, using code words, for instance, or calling back through another phone line.”
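The kind of procedure De Geus describes can be sketched in code. This is an illustrative sketch only, not anything Orange Cyberdefense prescribes: the function name, the code-word check and the €10,000 threshold are all invented here to show the idea of requiring verification outside the original call.

```python
# Hypothetical out-of-band verification for urgent payment requests.
# All names and thresholds below are illustrative assumptions, not a
# real policy: the point is that a convincing call alone never suffices.

OUT_OF_BAND_THRESHOLD_EUR = 10_000  # assumed policy limit for extra checks


def verify_payment_request(amount_eur: float,
                           code_word_given: str,
                           expected_code_word: str,
                           confirmed_via_callback: bool) -> bool:
    """Approve an urgent transfer only if the agreed code word matches
    and, above the threshold, the request was confirmed by calling the
    requester back on a separately known phone line."""
    if code_word_given != expected_code_word:
        return False  # wrong code word: treat the call as suspect
    if amount_eur >= OUT_OF_BAND_THRESHOLD_EUR and not confirmed_via_callback:
        return False  # large amount: require a call-back on another line
    return True


# Even a flawless deepfake of the CFO fails without the call-back step.
print(verify_payment_request(25_000_000, "heron", "heron", False))  # False
print(verify_payment_request(25_000_000, "heron", "heron", True))   # True
```

The design point is that both checks live outside the channel the attacker controls: the code word was agreed in advance, and the call-back uses a phone number the employee already knows.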

Jelle Wieringa, security awareness advocate at KnowBe4, also sees that the accessibility and ease of use of AI are making it increasingly easy to scam people and spread disinformation. “Cyber criminals are also using deepfakes to get hold of sensitive business information,” he said. “They try to manipulate their victims to commit fraud or steal data. Usually, they do this with phishing emails, but as the technology to produce deepfakes gets better and better, we also see them using audio for this.”

Wieringa gave the example of the company Retool, which some time ago had to deal with a hacker who had reproduced the voice of an IT help desk employee. “Another employee, who thought he had his colleague on the line, was scammed this way,” he said.

Election fraud

De Geus also urged organisations to pay attention to awareness, training and procedures against this threat. “Certainly for organisations at increased risk, such as those with a lot of international and remote work, or with large flows of money, it is very relevant to include this in your risk assessment,” he said.

When a company faces deepfake fraud, it must determine how this could have happened. “Chances are that the network was compromised much earlier,” said De Geus, drawing a comparison with phishing and ransomware attacks. “It often turns out afterwards that the threat actor was already present in the network before the attack.”

Incidentally, deepfakes are a risk not only for financial fraud, but also for elections, for example. “With fake videos, you can influence voters and voting behaviour,” he said. “That is a realistic threat.”

This recently happened in Slovakia, where deepfake videos circulated on social media in which the leader of one of the political parties appeared to discuss vote-buying with a journalist. Fact-checkers from news agency AFP, with the help of experts, concluded that the video was fake.


