San Francisco goes after websites that make AI deepfake nudes of women and girls



Nearly a year after AI-generated nude images of high school girls upended a community in southern Spain, a juvenile court this summer sentenced 15 of their classmates to a year of probation.

But the artificial intelligence tool used to create the images is still easily accessible on the internet, promising to “undress any photo” uploaded to the website within seconds.

Now a new effort to shut down the app and others like it is being pursued in California, where San Francisco this week filed a first-of-its-kind lawsuit that experts say could set a precedent but will also face many hurdles.

“The proliferation of these images has exploited a shocking number of women and girls across the globe,” said David Chiu, the elected city attorney of San Francisco who brought the case against a group of widely visited websites tied to entities in California, New Mexico, Estonia, Serbia, the United Kingdom and elsewhere.

“These images are used to bully, humiliate and threaten women and girls,” he said in an interview with The Associated Press. “And the impact on the victims has been devastating on their reputation, mental health, loss of autonomy, and in some instances, causing some to become suicidal.”

The lawsuit brought on behalf of the people of California alleges that the services broke numerous state laws against fraudulent business practices, nonconsensual pornography and the sexual abuse of children. But it can be hard to determine who runs the apps, which are unavailable in phone app stores but still easily found on the internet.

Contacted late last year by the AP, one service claimed by email that its “CEO is based and moves throughout the USA” but declined to provide any evidence or answer other questions. The AP is not naming the specific apps being sued in order to not promote them.

“There are a number of sites where we don’t know at this moment exactly who these operators are and where they’re operating from, but we have investigative tools and subpoena authority to dig into that,” Chiu said. “And we will certainly utilize our powers in the course of this litigation.”

Many of the tools are being used to create realistic fakes that “nudify” photos of clothed adult women, including celebrities, without their consent. But they have also popped up in schools around the world, from Australia to Beverly Hills in California, typically with boys creating the images of female classmates that then circulate through social media.

In one of the first widely publicized cases, last September in Almendralejo, Spain, a physician who helped bring the images to public attention after her daughter was among the victims said she is satisfied with the severity of the sentence her daughter’s classmates are facing after the court decision earlier this summer.

But it is “not only the responsibility of society, of education, of parents and schools, but also the responsibility of the digital giants that profit from all this garbage,” Dr. Miriam Al Adib Mendiri said in an interview Friday.

She applauded San Francisco’s action but said more efforts are needed, including from tech giants like Meta and its subsidiary WhatsApp, which was used to circulate the images in Spain.

While schools and law enforcement agencies have sought to punish those who make and share the deepfakes, authorities have struggled with what to do about the tools themselves.

In January, the executive branch of the European Union explained in a letter to a Spanish member of the European Parliament that the app used in Almendralejo “does not appear” to fall under the bloc’s sweeping Digital Services Act because it is not a big enough platform.

Organizations that have been tracking the growth of AI-generated child sexual abuse material will be closely following the San Francisco case.

The lawsuit “has the potential to set legal precedent in this area,” said Emily Slifer, the director of policy at Thorn, an organization that works to combat the sexual exploitation of children.

A researcher at Stanford University said that because so many of the defendants are based outside the U.S., it will be harder to bring them to justice.

Chiu “has an uphill battle with this case, but may be able to get some of the sites taken offline if the defendants running them ignore the lawsuit,” said Stanford’s Riana Pfefferkorn.

She said that could happen if the city wins by default in their absence and obtains orders affecting domain-name registrars, web hosts and payment processors “that would effectively shutter those sites even if their owners never appear in the litigation.”


