Baynard Media
Tech

Apple sued for a billion dollars over alleged failure to block child sex abuse materials

By Editor · December 10, 2024 · 3 min read

Apple is once again facing a billion-dollar lawsuit, as thousands of victims come forward against the company over its alleged complicity in the spread of child sex abuse materials (CSAM).

In a lawsuit filed Dec. 7, the tech giant is accused of reneging on mandatory reporting duties — which require U.S.-based tech companies to report instances of CSAM to the National Center for Missing & Exploited Children (NCMEC) — and allowing CSAM to proliferate. In failing to institute promised safety mechanisms, the lawsuit claims, Apple has sold “defective products” to specific classes of customers (CSAM victims).

Some of the plaintiffs argue they have been continuously re-traumatized by the spread of content long after they were children, as Apple has chosen to focus on preventing new cases of CSAM and the grooming of young users.

“Thousands of brave survivors are coming forward to demand accountability from one of the most successful technology companies on the planet. Apple has not only rejected helping these victims, it has advertised the fact that it does not detect child sex abuse material on its platform or devices thereby exponentially increasing the ongoing harm caused to these victims,” wrote lawyer Margaret E. Mabie.

The company has retained tight control over its iCloud product and user libraries as part of its wider privacy promises. In 2022, Apple scrapped plans for a controversial tool that would automatically scan and flag iCloud photo libraries for abusive or problematic material, including CSAM. Apple cited growing concern over user privacy and mass surveillance by Big Tech in its decision to shelve the scanning feature, a move widely supported by privacy groups and activists around the world. But the new lawsuit argues that the tech giant merely used this cybersecurity defense to skirt its reporting duties.

“Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk,” wrote Apple spokesperson Fred Sainz in response to the lawsuit. “We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts.”

Tech companies have struggled to control the spread of abusive material online. A 2024 report by UK watchdog the National Society for the Prevention of Cruelty to Children (NSPCC) accused Apple of vastly underreporting the amount of CSAM shared across its products, with the company submitting just 267 worldwide reports of CSAM to NCMEC in 2023. Competitors Google and Meta reported more than 1 million and 30 million cases, respectively. Meanwhile, growing concern over the rise of digitally altered or synthetic CSAM has complicated the regulatory landscape, leaving tech giants and social media platforms racing to catch up.

Beyond the potential billion-dollar payout should the case reach a jury and be decided in the plaintiffs' favor, the outcome carries wider repercussions for the industry and for privacy efforts at large. The court could compel Apple to revive its photo library scanning tool or adopt other measures to remove abusive content, paving a more direct path toward government surveillance and dealing another blow to Section 230 protections.


