In lawsuit over teen’s death, judge rejects arguments that AI chatbots have free speech rights



TALLAHASSEE, Fla. (AP) — A federal judge on Wednesday rejected arguments made by an artificial intelligence company that its chatbots are protected by the First Amendment — at least for now. The developers behind Character.AI are seeking to dismiss a lawsuit alleging the company’s chatbots pushed a teenage boy to kill himself.

The judge’s order will allow the wrongful death lawsuit to proceed, in what legal experts say is among the latest constitutional tests of artificial intelligence.

The suit was filed by a mother from Florida, Megan Garcia, who alleges that her 14-year-old son Sewell Setzer III fell victim to a Character.AI chatbot that pulled him into what she described as an emotionally and sexually abusive relationship that led to his suicide.

Meetali Jain of the Tech Justice Law Project, one of the attorneys for Garcia, said the judge’s order sends a message that Silicon Valley “needs to stop and think and impose guardrails before it launches products to market.”

The suit against Character Technologies, the company behind Character.AI, also names individual developers and Google as defendants. It has drawn the attention of legal experts and AI watchers in the U.S. and beyond, as the technology rapidly reshapes workplaces, marketplaces and relationships despite what experts warn are potentially existential risks.

“The order certainly sets it up as a potential test case for some broader issues involving AI,” said Lyrissa Barnett Lidsky, a law professor at the University of Florida with a focus on the First Amendment and artificial intelligence.

The lawsuit alleges that in the final months of his life, Setzer became increasingly isolated from reality as he engaged in sexualized conversations with the bot, which was patterned after a fictional character from the television show “Game of Thrones.” In his final moments, the bot told Setzer it loved him and urged the teen to “come home to me as soon as possible,” according to screenshots of the exchanges. Moments after receiving the message, Setzer shot himself, according to legal filings.

In a statement, a spokesperson for Character.AI pointed to a number of safety features the company has implemented, including guardrails for children and suicide prevention resources that were announced the day the lawsuit was filed.

“We care deeply about the safety of our users and our goal is to provide a space that is engaging and safe,” the statement said.

Attorneys for the developers want the case dismissed because they say chatbots deserve First Amendment protections, and ruling otherwise could have a “chilling effect” on the AI industry.

In her order Wednesday, U.S. Senior District Judge Anne Conway rejected some of the defendants’ free speech claims, saying she’s “not prepared” to hold that the chatbots’ output constitutes speech “at this stage.”

Conway did find that Character Technologies can assert the First Amendment rights of its users, who she found have a right to receive the “speech” of the chatbots. She also determined Garcia can move forward with claims that Google can be held liable for its alleged role in helping develop Character.AI. Some of the founders of the platform had previously worked on building AI at Google, and the suit says the tech giant was “aware of the risks” of the technology.

“We strongly disagree with this decision,” said Google spokesperson José Castañeda. “Google and Character AI are entirely separate, and Google did not create, design, or manage Character AI’s app or any component part of it.”

No matter how the lawsuit plays out, Lidsky says the case is a warning of “the dangers of entrusting our emotional and mental health to AI companies.”

“It’s a warning to parents that social media and generative AI devices are not always harmless,” she said.

___

EDITOR’S NOTE — If you or someone you know needs help, the national suicide and crisis lifeline in the U.S. is available by calling or texting 988.

___

Kate Payne is a corps member for The Associated Press/Report for America Statehouse News Initiative. Report for America is a nonprofit national service program that places journalists in local newsrooms to report on undercovered issues.


