‘Would love to see her faked’: the dark world of sexual deepfakes – and the women fighting back

It began with an anonymous email. “I’m genuinely so, so sorry to reach out to you,” it read. Beneath the words were three links to an internet forum. “Huge trigger warning … They contain lewd photoshopped images of you.”

Jodie (not her real name) froze. In the past, the 27-year-old from Cambridgeshire had had problems with people stealing her photos to set up dating profiles and social media accounts. She had reported it to police but been told there was nothing they could do, so pushed it to the back of her mind.

But this email, on 10 March 2021, was impossible to ignore. She clicked the links. “It was just like time stood still,” she said. “I remember letting out a huge scream. I completely broke down.”

The forum, an alternative pornographic website, contained hundreds of photos of her – on her own, on holiday, with her friends and housemates – alongside comments calling them “sluts” and “whores” and asking people to rate them, or fantasise about what they would do.

The person posting the pictures had also shared an invitation to other members of the forum: to use fully clothed photos of Jodie, taken from her private social media, to create sexually explicit “deepfakes” – digitally altered content made using artificial intelligence.

“Never done this before, but would LOVE to see her faked… Happy to chat/show you more of her too… :D,” they had written. In response, users had posted their creations: hundreds of synthetic images and videos showing a woman’s body with Jodie’s face. Some featured her image in the classroom, wearing a schoolgirl outfit and being abused by a teacher. Others showed her fully “nude”. “I was having sex in every one of them,” she said. “The shock and devastation haunts me to this day.”

The fake images – which have now been removed – are among a growing number of synthetic, sexually explicit pictures and videos being made, traded and sold online in Britain and around the world – on social media apps, in private messages and through gaming platforms, as well as on adult forums and porn sites.

Inside the helpline’s offices. Photograph: Jim Wileman/The Observer

Last week, the government announced a “crackdown” on explicit deepfakes, promising to expand the current law to make creating the images without consent a criminal offence, as well as sharing them, which has been illegal since January 2024. But soliciting deepfakes – getting someone to make them for you – isn’t set to be covered. The government has also yet to confirm whether the offence will be consent based – which campaigners say it must be – or whether victims will have to prove the perpetrator had malicious intent.

At the headquarters of the Revenge Porn Helpline, in a business park on the outskirts of Exeter, Kate Worthington, 28, a senior practitioner, says stronger laws – without loopholes – are desperately needed.

The helpline, launched in 2015, is a dedicated service for victims of intimate image abuse, part-funded by the Home Office. Deepfake cases are at an all-time high: reports of synthetic image abuse have risen by 400% since 2017. But they remain small in proportion to intimate image abuse overall – there were 50 cases last year, making up about 1% of the total caseload. The main reason for this is that it is drastically under-reported, says Worthington. “A lot of the time, the victim has no idea their images have been shared.”

The team has noticed that many perpetrators of deepfake image abuse appear to be motivated by “collector culture”. “Often it’s not done with the intent of the person knowing,” says Worthington. “It’s being sold, swapped, traded for sexual gratification – or for status. If you’re the one finding this content and sharing it, alongside Snap handles, Insta handles, LinkedIn profiles, you might be glorified.” Many are made using “nudification” apps. In March, the charity that runs the Revenge Porn Helpline reported 29 such services to Apple, which removed them.

It’s being sold, swapped, traded for sexual gratification – or for status

Kate Worthington

In other cases, synthetic images have been used to directly threaten or humiliate people. The helpline has heard cases of young boys making fake sexual images of female relatives; of men with porn addictions creating synthetic pictures of their partners performing sexual acts they did not consent to in real life; and of people having pictures taken of them in the gym which were then made into deepfaked videos, to look as if they were having sex. Most of those targeted – but not all – are women: about 72% of deepfake cases seen by the helpline involved women. The oldest was in her seventies.

There have also been several cases of Muslim women being targeted with deepfaked images in which they were wearing revealing clothing, or had their hijabs removed.

Regardless of intent, the impact is often extreme. “These photos are so realistic, often. Your colleague, neighbour, grandma isn’t going to know the difference,” Worthington says.

Senior helpline practitioner Kate Worthington. Photograph: Jim Wileman/The Observer

The Revenge Porn Helpline can help people get abusive imagery removed. Amanda Dashwood, 30, who has worked at the helpline for two years, says this is usually callers’ priority. “It’s, ‘Oh my god, please help me, I need to get this taken down before people see it,’” she says.

She and her colleagues on the helpline team – eight women, mostly aged under 30 – have various tools at their disposal. If the victim knows where content of them has been posted, the team will issue a takedown request direct to the platform. Some platforms ignore requests altogether, but the helpline has partnerships with most of the major ones – including Snapchat, Pornhub and OnlyFans – and is successful in getting content removed 90% of the time.

If the victim doesn’t know where content has been posted, or suspects it has been shared more widely, the team will ask them to send in a selfie and, with their consent, run it through facial recognition technology, or use reverse image-search tools. The tools aren’t foolproof but can detect material shared on the open web.

The team can also advise steps to stop content being posted online again. They will direct people to a service called StopNCII, a tool created with funding from Meta by SWGFL, the online safety charity under which the Revenge Porn Helpline also sits.

People can upload photos – real or synthetic – and the technology creates a unique hash, which is shared with partner platforms – including Facebook, Instagram, TikTok, Snapchat and Pornhub (but not X or Discord). If someone then tries to upload that image, it is automatically blocked. As of December, a million images have been hashed and 24,000 uploads pre-emptively blocked.
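For readers curious how a platform can block a photo it has never seen, here is a minimal sketch in Python of the general idea behind hash-matching. It is illustrative only: StopNCII’s production system uses far more robust perceptual hashing, generates the hash on the victim’s own device, and every file name and function below is invented for this example.

```python
# Illustrative sketch only - not StopNCII's actual algorithm.
# Assumes Pillow is installed; file names are placeholders.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to an 8x8 greyscale grid, then set one bit
    per pixel: 1 if brighter than the mean, 0 otherwise."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# The victim submits only the hash; the photo itself never leaves
# their device. Partner platforms hold a shared block list of hashes.
block_list = {average_hash("reported_image.jpg")}


def should_block(upload_path: str, threshold: int = 5) -> bool:
    """Reject an upload whose hash sits within `threshold` bits of a
    blocked hash - tolerant of re-encoding, mild resizing and the like."""
    h = average_hash(upload_path)
    return any(hamming_distance(h, known) <= threshold for known in block_list)
```

A whole-file checksum such as SHA-256 would stop matching after any re-save or crop; a perceptual hash is used precisely so that near-duplicates of a blocked image are still caught.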


Alex Woolf was convicted because of the derogatory nature of the posts, rather than for soliciting the images. Photograph: Handout

Some also go on to report it to the police, but the response varies drastically by force. Victims trying to report synthetic image abuse have been told police cannot help with edited images, or that prosecution would not be in the public interest.

Sophie Mortimer, the helpline’s manager, recalls another case where police said “no, that’s not you; that’s someone who looks like you” – and refused to investigate. “It does feel like sometimes the police look for reasons not to pursue these sorts of cases,” Mortimer says. “We know they’re difficult, but that doesn’t negate the real harm that’s being caused to people.”

In November Sam Millar, an assistant chief constable and strategic director for Violence Against Women and Girls at the National Police Chiefs’ Council, told a parliamentary inquiry into intimate image abuse that she was “deeply worried” about officers’ lack of understanding of the legislation, and about inconsistencies between cases. “Even yesterday, a victim said to me that she is in a conversation with 450 victims of deepfake imagery, but only two of them had had a positive experience of policing,” she said.

For Jodie, the need for better awareness of deepfake abuse – among the public, as well as the police – is clear.

After she was alerted to the deepfakes of her, she spent hours scrolling through the posts, trying to piece together what had happened.

She realised they had not been shared by a stranger but by her close friend Alex Woolf, a Cambridge graduate and former BBC young composer of the year. He had posted a photo of her from which he had been cropped out. “I knew I hadn’t posted that picture on social media and had only sent it to him. That’s when the penny dropped.”

Helpline manager Sophie Mortimer. Photograph: Jim Wileman/The Observer

After Jodie and the other women targeted spent hours sifting through graphic material of themselves, and gave police a USB stick containing 60 pages of evidence, Woolf was charged.

He was subsequently convicted and given a 20-week suspended prison sentence with a rehabilitation requirement and 150 hours of unpaid work. The court ordered him to pay £100 compensation to each of the 15 victims, and to delete all the images from his devices. But the conviction – 15 counts of sending messages that were grossly offensive, indecent, obscene or menacing – related to the derogatory nature of the posts, rather than to his solicitation of the synthetic images themselves.

Jodie is highly critical of the police. “From the outset, it felt like they didn’t take the abuse seriously,” she says. She says she also faced an “uphill battle” with the forum to get the synthetic images removed.

But her biggest concern is that the law itself is lacking. Had Woolf not posted the graphic comments, he might not have been convicted. And under the law proposed by the government – based on the details it has published so far – his act of soliciting fake images of Jodie would not be a specific offence.

The Ministry of Justice has said assisting someone to commit a crime is already illegal – which would cover solicitation. But Jodie said: “It needs to be watertight and black and white for the CPS to make a charging decision. So why would we allow this loophole to exist?”

What many don’t realise is that it’s ‘normal’ people doing this

‘Jodie’

She is calling on the government to adopt another piece of legislation – a private member’s bill put forward by Baroness Owen, drawn up with campaigners, which ensures deepfake creation is consent based and includes an offence of solicitation. The call has been backed by the End Violence Against Women Coalition and charities including Refuge, as well as the Revenge Porn Helpline.

What Jodie hopes, above all, is that people will realise the “monumental impact” deepfake abuse can have. Three years on, she speaks using a pseudonym because, if she used her real name, she would risk being targeted again. Even though the original images were removed, she says she lives in “constant fear” that some might still be circulating, somewhere.

It has also affected her friendships, relationships, and her view of men overall. “For me it was the ultimate betrayal from someone that I really trusted,” she says. What many don’t realise is that it’s “normal people doing this”, she adds. It’s not “monsters or weirdos. It’s people that live among us – our colleagues, partners, friends.”


