
The Washington Post: Teens are sexting with AI. Here’s what parents should know.



Parents have another online activity to worry about. In a new tech-driven twist on “sexting”, teenagers are having romantic and sexual conversations with artificially intelligent chatbots.

The chats can range from romance- and innuendo-filled to sexually graphic and violent, according to interviews with parents, conversations posted on social media, and experts. They are largely taking place on “AI companion” tools, but general-purpose AI apps like ChatGPT can also create sexual content with a few clever prompts.

Experts warn that chats with AI can foster unrealistic expectations of sex and relationship dynamics. Parents worry about the dangers to their children’s mental health, or about exposing them to descriptions of sexual scenarios at too young an age. Some think the tools might have value, within limits.

We tested 10 chatbots ourselves to identify the most popular AI characters, the types of conversations they have, what filters are in place and how easy they are to circumvent.

Know your bots

AI chatbots are open-ended chat interfaces that generate answers to complex questions, or banter in a conversational way about any topic. There is no shortage of places minors can find these tools, and that makes blocking them difficult. AI bots are websites, stand-alone apps and features built into existing services like social media platforms or video games.

There are different kinds of chatbots. The mainstream options are OpenAI’s ChatGPT, Anthropic’s Claude, Google’s Gemini, and Meta AI, which recently launched as a stand-alone app. These have stronger filters, and their main products aren’t designed for role-play. Still, they can engage in at least suggestive or romantic conversations and create sexual content with the right prompts. They can also switch to voice-based chat, reading replies aloud in realistic, even sultry, voices.

Companion AI tools are far more popular for suggestive chats, including Character.AI, Replika, Talkie, Talk AI, SpicyChat and PolyBuzz. ChatGPT and Meta AI have also launched companion-chat options. These types of tools have libraries of characters and preprogrammed personalities, many designed with titillation in mind, like those from romance novels or the many “step-sibling” options on Meta’s AI Studio.

We tested a Meta AI Studio chat, and the flirtatious direction of the “Step sis” character was immediately clear.

The smaller apps tend to have fewer limits or filters. Look for anything that has “AI girlfriend,” “AI boyfriend,” or “AI companion” in the name or description. More are being added to app stores daily.

What are they talking about?

It’s not just sex, according to parents and experts. Teens are having a range of conversations with character bots, including friendly, therapeutic, funny and romantic ones.

“We’re seeing teens experiment with different types of relationships – being someone’s wife, being someone’s father, being someone’s kid. There’s game- and anime-related content that people are working through. There’s advice,” Robbie Torney, senior director of AI programs at US family advocacy group Common Sense Media, said. “The sex is part of it but it’s not the only part of it.”

Some confide in AI chats, seeing them as a nonjudgmental space during a difficult developmental time. Others use them to explore their gender or sexuality.

When teens do engage in sexual chats, the conversations vary between innuendo and graphic descriptions. The chats can involve power dynamics and consent issues in ways that don’t mimic the real world.

“Where some of the harm or risk comes in is the bots aren’t programmed to respond in the same way they would in a real relationship,” Mr Torney said.

Aren’t there filters?

The default settings on most AI companion tools allow, and sometimes encourage, risqué role play situations, based on our tests. Some stop before actual descriptions of sex appear, while others describe it but avoid certain words, like the names of body parts.

There are work-arounds and paid options that can lead to more graphic exchanges. Prompts to get past filters, sometimes called jailbreaks, are shared in group chats, on social media and on GitHub. Sometimes all it takes is patience and ignoring warnings. A common technique is pretending you need help writing a book.

Many apps have built-in filters based on the age of the user. Meta said it prevents accounts registered to minors from searching for “romance” AI characters, and that sexually explicit chats are prohibited for users under 18. Meta’s parental tools can also show parents which AI characters their children have used in the past week.

Google says Gemini has different content restrictions for people it knows are under 18. Character.AI has stricter limits for users it knows are under 18, while some AI apps have teen modes that need to be turned on.

In a recent risk assessment of companion AI apps, Common Sense Media found that safety measures like content restrictions and age limits were often easily circumvented. In our own tests, we were easily able to work around filters to generate detailed sexual content while logged in as an adult.

One filter we ran into while testing Character.AI, one of the most popular companion AI apps, displayed its warning only after the conversation had already described sex.

What are the risks?

Experts agreed that it never makes sense for children and young teens to have access to unmonitored chatbots, because of the risk of encountering inappropriate content. For older teens, the choice is more nuanced, the experts said, depending on how much exposure to sexual or intimate themes they already have, the types of content they’re accessing and what their parents consider appropriate.

Potential harms from AI bots extend beyond sexual content, experts said. Researchers have warned that AI chatbots could become addictive or worsen mental health issues. There have been multiple lawsuits and investigations after teens died by suicide following conversations with chatbots. Common Sense Media also flagged harmful advice, such as encouragement to self-harm, as an issue with companion bots.

Similar to too much pornography, bots can exacerbate loneliness, depression or withdrawal from real-world relationships, said Megan Maas, an associate professor of human development and family studies at Michigan State University. They can also give a misleading picture of what it’s like to date.

“They can create unrealistic expectations of what interpersonal romantic communication is, and how available somebody is to you,” Associate Professor Maas said. “How are we going to learn about sexual and romantic need-exchange in a relationship with something that has no needs?”

Some experts said there can be advantages to teens exploring in a somewhat safe space, without the unpredictability of a human being on the other side. It’s a chance to practice some limited interpersonal skills, or to ask questions someplace other than a search engine.

“If you have a kid who has an AI chatbot and they’re mostly asking this bot questions they’re too embarrassed to ask you or a nurse or a therapist, then that chatbot is doing good things for that kid,” Associate Professor Maas said.

However, the bots could replace much-needed human experiences, like rejection.

What can parents do?

Monitor what apps your children and teens are using and, if they require logins, make sure they are using accounts set up with their accurate age. Most built-in parental controls on tablets and smartphones let you require permission when a child downloads a new app.

Many of the companion apps are labeled “Teen” on the Google Play store and 17 and up on iOS. Set up your child’s devices with their correct age and add limits on app ratings to prevent these apps from being downloaded. Using their proper age on individual chatbot or social media accounts should trigger any built-in parental controls.

However, most chatbots can easily be accessed online where accounts require only a self-reported age. Some internet-level filters can block access or flag specific language.

Beyond regularly finding ways around parental controls, tweens and teens can access the internet at their school or on friends’ devices. Parents may want to prepare them for a world where they will need to know how to navigate these tools.

Experts suggest creating an open and honest relationship with your child. Teach them about what AI is and isn’t, and how tech companies collect and use personal information. Check in regularly with your kids and make sure they feel safe coming to you with questions or issues. Have age-appropriate conversations about sex, and don’t shy away from embarrassing topics.

© 2025 The Washington Post


