Baynard Media
Tech

Meta announces temporary chatbot updates to protect teen users

By Editor | August 30, 2025

Meta is instituting interim safety changes to ensure the company’s chatbots don’t cause additional harm to teen users, as AI companies face a wave of criticism for their allegedly lax safety protocols.

In an exclusive interview with TechCrunch, Meta spokesperson Stephanie Otway said the company’s AI chatbots are now being trained to no longer “engage with teenage users on self-harm, suicide, disordered eating, or potentially inappropriate romantic conversations.” Previously, chatbots had been permitted to broach such topics when “appropriate.”


Meta will also allow teen accounts to use only a select group of AI characters, ones that “promote education and creativity,” ahead of a more robust safety overhaul in the future.

Earlier this month, Reuters reported that some of Meta’s chatbot policies, per internal documents, allowed avatars to “engage a child in conversations that are romantic or sensual.” Reuters published another report today detailing both user- and employee-created AI avatars that used the names and likenesses of celebrities such as Taylor Swift and engaged in “flirty” behavior, including sexual advances. Some of the chatbots used personas of child celebrities as well, and others were able to generate sexually suggestive images.


Meta spokesman Andy Stone told the publication that the chatbots should not have been able to engage in such behavior, but that celebrity-inspired avatars were not banned outright as long as they were labeled as parody. Around a dozen of the avatars have since been removed.

OpenAI recently announced additional safety measures and behavioral prompts for GPT-5, its latest model, following a wrongful death lawsuit filed by the parents of a teen who died by suicide after confiding in ChatGPT. Prior to the lawsuit, OpenAI had announced new mental health features intended to curb “unhealthy” behaviors among users. Anthropic, maker of Claude, recently updated the chatbot so it can end chats it deems harmful or abusive. Character.AI, which hosts increasingly popular AI companions despite reports of unhealthy interactions with teen users, introduced parental supervision features in March.

This week, a group of 44 attorneys general sent a letter to leading AI companies, including Meta, demanding stronger protections for minors who may come across sexualized AI content. More broadly, experts have voiced growing concern about the impact of AI companions on young users as adoption among teens rises.
