Child predators are using AI to create sexual images of their favorite ‘stars’: ‘My body will never be mine again’ | Artificial intelligence (AI)



Predators active on the dark web are increasingly using artificial intelligence to create sexually explicit images of children, fixating especially on “star” victims, child safety experts warn.

Child safety groups tracking the activity of predators chatting in dark web forums say they are increasingly finding conversations about creating new images based on older child sexual abuse material (CSAM). Many of these predators using AI obsess over child victims referred to as “stars” in predator communities for the popularity of their images.

“The communities of people who trade this material get infatuated with individual children,” said Sarah Gardner, chief executive officer of the Heat Initiative, a Los Angeles non-profit focused on child protection. “They want more content of those children, which AI has now allowed them to do.”

These abuse survivors may now be grown adults, but AI has exacerbated the prospect that more people may be viewing sexual content depicting them as children, according to experts and abuse survivors interviewed. They fear that images of them circulating on the internet or in their communities could threaten the lives and careers they’ve built since their abuse ended.

Megan, a survivor of CSAM, whose last name is being withheld because of past violent threats, says that the potential for AI to be used to manipulate her images has become an increasingly stressful prospect over the past 12 months, though her own abuse occurred a decade ago.

“AI gives perpetrators the chance to create even more situations of my abuse to feed their own fantasies and their own versions,” she said. “The way my images could be manipulated with AI could give the false impression it was not harmful or that I was enjoying the abuse.”

Since dark web browsers enable users to be anonymous or untraceable, child safety groups have few means of requesting these images be removed or reporting the users to law enforcement.

Advocates have called for legislation that goes beyond criminalization to prevent the production of CSAM, by AI and otherwise. They are pessimistic, though, that much can be done to enforce bans on the creation of new sexualized images of children now that the AI enabling it has become open source and private. Encrypted messaging services, now often default options, allow predators to communicate undetected, advocates say.

Creating new CSAM and reviving old CSAM with AI

The Guardian has viewed several excerpts of these dark web chat room conversations, with the names of victims redacted for safeguarding. The discussions take an amiable tone, and forum members are encouraged to create new images with AI to share in the groups. Many said they were thrilled at the prospect of new material made with AI; others were uninterested because the images do not depict real abuse.

One message from November 2023 reads: “Could you get the AI to recreate the beautiful images of former CP [child pornography] stars [redacted victim name] and [redacted victim name] and get them in some scenes – like [redacted victim name] in a traditional Japanese schoolgirl’s uniform at Elementary School, and [redacted victim name] in a cheerleader’s outfit at Junior High?”

In another chat room conversation, predators also discussed using AI to digitally remaster decades-old popular child exploitation material of low quality.

“Wow you are awesome,” one predator wrote to another in January. “I appreciate your effort keep going upscaling classical vids.”

While predators have used photo editing software in the past, new advancements in AI models present easy-access opportunities to create more realistic sexual images of children.

Much of this activity focuses on so-called “stars”.

“In the same way there are celebrities in Hollywood, in these online communities on the dark web, there’s a celebrity-like ranking of some of the favourite victims,” said Jacques Marcoux, director of research and analytics at the Canadian Centre for Child Protection. “These offender groups know them all, and they catalogue them.”

“Offenders eventually exhaust all the material of a specific victim,” said Marcoux. “So they can take an image of a victim that they like, and they can make that victim do different poses or do different things. They can nudge it with an AI model to do different poses on a bed or be in different stages of undress.”

Data bears out the phenomenon of predators’ preoccupation with “stars”. In a report to the National Center for Missing and Exploited Children, Meta reported that just six videos accounted for half of all the child sexual abuse material being shared and re-shared on Facebook and Instagram. Roughly 90% of the abusive material Meta tracked in a two-month period was the same as previously reported content.

Real Hollywood celebrities are also potential targets for victimization with AI-generated CSAM. The Guardian reviewed chatroom threads on the dark web discussing desires for predators who are proficient in AI to create child sexual images of celebrities, including teen idols from the 1990s who are now adults.

How child sexual abuse material made by AI spreads

Predators’ use of AI became prevalent at the end of 2022, child safety experts said. That same year, as OpenAI debuted ChatGPT, the LAION-5B database – an open-source catalogue of more than 5bn images that anyone can use to train AI models – was launched by an eponymous non-profit.

A Stanford University report released in December 2023 revealed that hundreds of known images of child sexual abuse had been included in LAION-5B and were being used to train popular AI image-generation models to generate CSAM. Though the images were a minor fraction of the whole database, they carry an outsize risk, experts said.

“As soon as these things were open sourced, that’s when the production of AI generative CSAM exploded,” said Dan Sexton, chief technology officer at the Internet Watch Foundation, a UK-based non-profit that focuses on preventing online child abuse.

The knowledge that real abuse images are used to train AI models has resulted in additional trauma for some survivors.

“Non-consensual images of me from when I was 14 years old can be resurrected to create new child sexual abuse images, and videos of victims around the world,” said Leah Juliett, 27, a survivor of child sexual abuse material and activist. “To know my photos can still be weaponized without my consent to harm other young children, it’s a pain and a feeling of helplessness and injustice.”

“My body will never be mine again, and that’s something that many survivors have to grapple with,” they added.

Experts say they’ve seen a shift towards predators using encrypted private messaging services such as WhatsApp, Signal and Telegram to spread and access CSAM. A great deal of CSAM is still shared outside of mainstream channels on the dark web, though. In an October 2023 report, the Internet Watch Foundation (IWF) said it found more than 20,000 AI-generated sexual images of children posted on just one dark web forum in a one-month period in September.

“Images show the rape of babies and toddlers; famous pre-teen children being sexually abused; BDSM (bondage and discipline, dominance and submission, and sadomasochism); content featuring tweens and teenagers, and more,” the report states.

Over the last year, AI image generators have improved across the board, and their output has become increasingly realistic. Child safety experts said AI-generated still images are often indistinguishable from real-life photos.

“We’re seeing discussions happen where [offenders] are discussing how to fix problems, such as signs the image is fake like extra fingers. They’re coming up with solutions. The realism is getting better,” said Sexton. “There is a demand to create more images of existing victims using fine-tune models.”

What effect will AI-generated CSAM have?

Experts say the impact of AI-generated CSAM is only starting to come into focus. In certain circumstances, viewing CSAM online can cause a predator’s behavior to escalate to committing contact offences with children, and it remains to be seen how AI plays into that dynamic.

“There are examples of men that I’ve worked with where their online behavior reinforced a sexual interest in children and led to a greater preoccupation of that sort of behavior,” said Tom Squire, head of clinical engagement at the Lucy Faithfull Foundation in the UK, a non-profit focused on preventing child sexual abuse. The organization operates an anonymous helpline for anyone with a concern about child sexual abuse, including their own thoughts or behaviors.

“They joined a group online where there was a currency to the sharing of images, and they wanted to contribute to that, then directly on from there they’ve gone on to sexually abuse children, and perhaps take images of that abuse and share it online,” said Squire.

Some predators mistakenly believe that viewing AI-generated CSAM may be more ethical than “real life” material, experts said.

“One of our concerns is the capacity for them to justify their behavior because these are somehow images of a victimless crime that doesn’t involve real-world harm,” said Squire. “Some of the people who call us make an argument to minimize the gravity of what they’re doing.”

What can be done to curb AI-generated sexualized images of children?

In many countries, including the US and UK, decades-old laws already criminalize any CSAM created using AI via prohibitions on any indecent or obscene visual depictions of children. Pornographic depictions of Taylor Swift made by AI and circulated early this year prompted the introduction of legislation in the US that would regulate such deepfakes.

In April, a 51-year-old US man was arrested in Florida on allegations he created CSAM using AI with the face of a child he’d taken pictures of in his neighborhood. On May 20, the US Department of Justice announced the arrest of a man in Wisconsin on criminal charges related to his alleged production, distribution and possession of more than 10,000 AI-generated images of minors engaged in sexually explicit conduct.

“We need legislative reform to ensure that abuse has no place to fester,” said Juliett. “But we also need cultural reform to stop abuse like this from happening in the first place.”

Child safety and tech experts interviewed were pessimistic about whether it is possible to prevent the production and distribution of AI-generated CSAM. They highlight that much of the production goes undetected by the authorities.

“Once it became open source, it was problematic,” said Michael Tunks, head of policy and public affairs at the Internet Watch Foundation. “Anybody can use text to image-based AI-generated tools to create any AI imagery they want.”

AI software is downloadable, which means these abusive and criminal activities can be taken offline.

“This means offenders can do it in the privacy of their own home, within the walls of their own network, therefore they’re not susceptible to getting caught doing this,” said Marcoux.





