CSAM Detection | Breaking Cybersecurity News | The Hacker News

E.U. Proposes New Rules for Tech Companies to Combat Online Child Sexual Abuse
May 12, 2022
The European Commission on Wednesday proposed new regulation that would require tech companies to scan for child sexual abuse material (CSAM) and grooming behavior, raising worries that it could undermine end-to-end encryption (E2EE). To that end, online service providers, including hosting services and communication apps, would be expected to proactively scan their platforms for CSAM, as well as report, remove, and disable access to such illicit content. While instant messaging services like WhatsApp already rely on hashed versions of known CSAM to automatically block new uploads of matching images or videos, the new plan would require such platforms to identify and flag new instances of CSAM. "Detection technologies must only be used for the purpose of detecting child sexual abuse," the regulator said. "Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate…"
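The hash-matching approach the teaser mentions can be sketched in a few lines: each uploaded file's digest is checked against a blocklist of hashes of known illicit material. This is a minimal illustration only; production systems (and the hash sets distributed by clearinghouses) use perceptual hashes such as PhotoDNA, which tolerate resizing and re-encoding, whereas the cryptographic hash below matches only byte-identical files. The blocklist contents here are placeholders.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known flagged files
# (placeholder value only -- it is the digest of the bytes b"test").
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_match(file_bytes: bytes) -> bool:
    """Return True if the file's digest appears in the blocklist.

    Real deployments use perceptual hashes that survive minor edits;
    an exact cryptographic hash is used here only to keep the sketch
    self-contained.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(is_known_match(b"test"))   # matches the placeholder digest -> True
print(is_known_match(b"other"))  # not in the blocklist -> False
```

The key limitation, and the reason the proposal goes further, is that such matching can only catch previously catalogued material, not newly produced content.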

Apple Delays Plans to Scan Devices for Child Abuse Images After Privacy Backlash
Sep 04, 2021
Apple is temporarily hitting the pause button on its controversial plans to screen users' devices for child sexual abuse material (CSAM) after receiving sustained blowback over worries that the tool could be weaponized for mass surveillance and erode the privacy of users. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the iPhone maker said in a statement on its website. The announcement, however, doesn't make clear what kind of input it would be gathering, the nature of the changes it aims to devise, or how it intends to implement the system in a way that mitigates the privacy and security concerns that could arise once it's deployed. The changes were originally slated to go live with iOS 15 and macOS Monterey later this year, starting with the U.S.

Apple to Scan Every Device for Child Abuse Content — But Experts Fear for Privacy
Aug 06, 2021
Apple on Thursday said it's introducing new child safety features in iOS, iPadOS, watchOS, and macOS as part of its efforts to limit the spread of Child Sexual Abuse Material (CSAM) in the U.S. To that effect, the iPhone maker said it intends to begin client-side scanning of images shared via every Apple device for known child abuse content as they are uploaded to iCloud Photos, in addition to leveraging on-device machine learning to vet all iMessage images sent or received by minor accounts (aged under 13) so that parents can be warned of sexually explicit photos shared over the messaging platform. Furthermore, Apple also plans to update Siri and Search to stage an intervention when users try to search for CSAM-related topics, alerting them that "interest in this topic is harmful and problematic." "Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit," Apple noted. "The feature is designed…"
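Client-side scanning of the kind described here hinges on perceptual hashing: a fingerprint computed on-device that stays stable under small edits such as re-compression, so a modified copy of a known image still matches. Apple's actual system uses its NeuralHash algorithm combined with cryptographic protocols (threshold secret sharing and private set intersection) so the device learns nothing about non-matches; the sketch below substitutes a much simpler average-hash over a grayscale pixel grid purely to illustrate the matching idea. All names and thresholds here are illustrative assumptions, not Apple's implementation.

```python
def average_hash(pixels):
    """Simple perceptual hash: one bit per pixel, 1 where the pixel is
    at or above the image mean. Small edits shift few bits, so similar
    images produce similar hashes (unlike cryptographic hashes)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p >= mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length hash strings."""
    return sum(x != y for x, y in zip(a, b))

def matches_known(pixels, known_hashes, threshold=1):
    """Flag the image client-side, before upload, if its hash is within
    `threshold` bits of any known hash (illustrative policy only)."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# A tiny 2x2 grayscale "image" and a slightly edited copy hash identically:
original = [[10, 200], [200, 10]]
edited = [[12, 198], [201, 9]]
print(average_hash(original))                              # "0110"
print(matches_known(edited, {average_hash(original)}))     # True
```

The privacy debate centers on exactly this property: because matching happens on the user's device against an opaque hash list, critics argue the same mechanism could be repurposed to scan for other content.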