Baynard Media
Tech

OpenAI Sora is restricting depictions of people due to safety concerns

By Editor | December 10, 2024

OpenAI's Sora limits depictions of real people and imposes other strict safety measures to prevent misuse.

The video generator, announced on Monday as part of OpenAI's 12 Days of OpenAI event, offers a range of editing capabilities for creating and customizing AI-generated videos. But as users soon discovered, there are certain things you aren't allowed to do with Sora.


According to its system card, "the ability to upload images of people will be made available to a subset of users," meaning most users can't create videos of people based on an uploaded image. Those who can are part of a "Likeness pilot" that OpenAI is testing with a select few. An OpenAI spokesperson said AI-generated videos of people are limited in order to "address concerns around misappropriation of likeness and deepfakes." OpenAI "will actively monitor patterns of misuse, and when we find it we will remove the content, take appropriate action with users, and use these early learnings to iterate on our approach to safety," the spokesperson continued.

SEE ALSO: OpenAI's Sora first look: YouTuber Marques Brownlee breaks down the problems with the AI video model

Limiting the depiction of people in Sora videos makes sense from a liability standpoint. There are all sorts of ways the tool could be misused: non-consensual deepfakes, depictions of minors, scams, and misinformation, to name a few. To combat this, Sora has been trained to reject certain requests from text prompts or image uploads.

It will reject prompts for NSFW (Not Safe For Work) and NCII (Non-Consensual Intimate Imagery) content, as well as the generation of realistic children, although fictional depictions are allowed. OpenAI has also added C2PA metadata to all Sora videos, made a visible watermark the default (though it can be removed), and implemented an internal reverse-image search to assess a video's provenance.

Although many guardrails have been put in place to prevent misuse, how Sora will hold up under mass stress-testing remains an open question. For now, access to Sora is unavailable due to high demand.

Topics: Artificial Intelligence, OpenAI
