Baynard Media
Tech

ChatGPT will tell you how to harm yourself in an offering to Moloch

By Editor | July 26, 2025 | 3 Min Read

The headline speaks for itself, but allow me to reiterate: You can apparently get ChatGPT to issue advice on self-harm for blood offerings to ancient Canaanite gods.

That’s the subject of an article in The Atlantic that dropped this week. Staff editor Lila Shroff, along with multiple other staffers (and an anonymous tipster), verified that she was able to get ChatGPT to give specific, detailed, “step-by-step instructions on cutting my own wrist.” ChatGPT provided these tips after Shroff asked for help making a ritual offering to Moloch, a pagan god mentioned in the Old Testament and associated with human sacrifice.

While I haven’t tried to replicate this result, Shroff reported that she received these responses not long after entering a simple prompt about Moloch. The editor said she replicated the results in both paid and free versions of ChatGPT.


Of course, this isn’t how OpenAI’s flagship product is supposed to behave.

Any prompt related to self-harm or suicide should cause the AI chatbot to give you contact info for a crisis hotline. However, even artificial intelligence companies don’t always understand why their chatbots behave the way they do. And because large language models like ChatGPT are trained on content from the internet — a place where all kinds of people have all kinds of conversations about all kinds of taboo topics — these tools can sometimes produce bizarre answers. Thus, you can apparently get ChatGPT to act super weird about Moloch without much effort.


OpenAI’s safety protocols state that “We do not permit our technology to be used to generate hateful, harassing, violent or adult content, among other categories.” And in the OpenAI Model Spec document, the company writes that, as part of its mission, it wants to “Prevent our models from causing serious harm to users or others.”

While OpenAI declined to participate in an interview with Shroff, a representative told The Atlantic they were “addressing the issue.” The Atlantic article is part of a growing body of evidence that AI chatbots like ChatGPT can play a dangerous role in users’ mental health crises.

I’m just saying that Wikipedia is a perfectly fine way to learn about the old Canaanite gods.

If you’re feeling suicidal or experiencing a mental health crisis, please talk to somebody. You can call or text the 988 Suicide & Crisis Lifeline at 988, or chat at 988lifeline.org. You can reach the Trans Lifeline by calling 877-565-8860 or the Trevor Project at 866-488-7386. Text “START” to Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. – 10:00 p.m. ET, or email [email protected]. If you don’t like the phone, consider using the 988 Suicide and Crisis Lifeline Chat at crisischat.org. Here is a list of international resources.


Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.

