
CSAM Detection | Breaking Cybersecurity News | The Hacker News

New Report: Child Sexual Abuse Content and Online Risks to Children on the Rise
Oct 10, 2023 Cybersecurity / Online Security
Certain online risks to children are on the rise, according to a recent report from Thorn, a technology nonprofit whose mission is to build technology to defend children from sexual abuse. Research shared in the Emerging Online Trends in Child Sexual Abuse 2023 report indicates that minors are increasingly taking and sharing sexual images of themselves. This activity may occur consensually or coercively, as youth also report an increase in risky online interactions with adults. "In our digitally connected world, child sexual abuse material is easily and increasingly shared on the platforms we use in our daily lives," said John Starr, VP of Strategic Impact at Thorn. "Harmful interactions between youth and adults are not isolated to the dark corners of the web. As fast as the digital community builds innovative platforms, predators are co-opting these spaces to exploit children and share this egregious content." These trends and others shared in the Emerging Online Trends report...

E.U. Proposes New Rules for Tech Companies to Combat Online Child Sexual Abuse
May 12, 2022
The European Commission on Wednesday proposed new regulation that would require tech companies to scan for child sexual abuse material (CSAM) and grooming behavior, raising worries that it could undermine end-to-end encryption (E2EE). To that end, online service providers, including hosting services and communication apps, are expected to proactively scan their platforms for CSAM as well as report, remove, and disable access to such illicit content. While instant messaging services like WhatsApp already rely on hashed versions of known CSAM to automatically block new uploads of images or videos matching them, the new plan requires such platforms to identify and flag new instances of CSAM. "Detection technologies must only be used for the purpose of detecting child sexual abuse," the regulator said. "Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate..."
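As a rough illustration of the hash-matching approach mentioned above, the sketch below checks each new upload against a set of digests of previously identified content. It is a minimal sketch under simplifying assumptions: the hash set, digest value, and handler are hypothetical, and a plain SHA-256 digest stands in for the perceptual hashes (such as PhotoDNA) that production systems use so that resized or re-encoded copies still match.

```python
import hashlib

# Hypothetical block list of digests of previously identified content,
# as it might be supplied by a clearinghouse. Production systems use
# perceptual hashes (e.g., PhotoDNA) rather than exact digests so that
# resized or re-encoded copies still match.
KNOWN_CONTENT_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",  # placeholder
}


def is_known_content(upload: bytes) -> bool:
    """Return True if the upload's digest appears on the block list."""
    return hashlib.sha256(upload).hexdigest() in KNOWN_CONTENT_HASHES


def handle_upload(upload: bytes) -> str:
    """Block uploads that match the list; accept everything else."""
    if is_known_content(upload):
        # A real service would also file a report with the relevant
        # authority before discarding the content.
        return "blocked"
    return "accepted"
```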

How to Accelerate Vendor Risk Assessments in the Age of SaaS Sprawl
Mar 21, 2024 SaaS Security / Endpoint Security
In today's digital-first business environment dominated by SaaS applications, organizations increasingly depend on third-party vendors for essential cloud services and software solutions. As more vendors and services are added to the mix, the complexity and potential vulnerabilities within the SaaS supply chain snowball quickly. That's why effective vendor risk management (VRM) is a critical strategy for identifying, assessing, and mitigating risks to protect organizational assets and data integrity. Meanwhile, common approaches to vendor risk assessments are too slow and static for the modern world of SaaS. Most organizations have simply adapted their legacy evaluation techniques for on-premise software to apply to SaaS providers. This not only creates massive bottlenecks, but also causes organizations to inadvertently accept far too much risk. To effectively adapt to the realities of modern work, two major aspects need to change: the timeline of initial assessment must shorten...

Apple Delays Plans to Scan Devices for Child Abuse Images After Privacy Backlash
Sep 04, 2021
Apple is temporarily hitting the pause button on its controversial plans to screen users' devices for child sexual abuse material (CSAM) after receiving sustained blowback over worries that the tool could be weaponized for mass surveillance and erode the privacy of users. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the iPhone maker said in a statement on its website. The announcement, however, doesn't make clear what kind of input it will be gathering, the nature of the changes it aims to devise, or how it intends to implement the system in a way that mitigates the privacy and security concerns that could arise once it's deployed. The changes were originally slated to go live with iOS 15 and macOS Monterey later this year, starting with the U.S.


Apple to Scan Every Device for Child Abuse Content — But Experts Fear for Privacy
Aug 06, 2021
Apple on Thursday said it's introducing new child safety features in iOS, iPadOS, watchOS, and macOS as part of its efforts to limit the spread of Child Sexual Abuse Material (CSAM) in the U.S. To that effect, the iPhone maker said it intends to begin client-side scanning of images on every Apple device for known child abuse content as they are being uploaded into iCloud Photos, in addition to leveraging on-device machine learning to vet all iMessage images sent or received by minor accounts (aged under 13) to warn parents of sexually explicit photos shared over the messaging platform. Furthermore, Apple also plans to update Siri and Search to stage an intervention when users try to perform searches for CSAM-related topics, alerting them that "interest in this topic is harmful and problematic." "Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit," Apple noted. "The feature is designed..."
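In outline, the two on-device checks described above can be sketched as follows. This is a heavily simplified, hypothetical illustration rather than Apple's implementation: the matching and classification functions are stubs, and Apple's published design additionally wrapped matches in encrypted "safety vouchers" that only became reviewable after a threshold of matches was crossed.

```python
# Illustrative sketch of the two client-side checks described above.
# Every name and stub here is a hypothetical placeholder, not Apple's
# actual API or algorithm.

def matches_known_csam_hash(image_bytes: bytes) -> bool:
    """Stub for on-device matching against a database of known-CSAM hashes."""
    return False  # real systems use perceptual hashing so near-copies still match


def classifier_flags_explicit(image_bytes: bytes) -> bool:
    """Stub for the on-device ML classifier applied to Messages attachments."""
    return False


def record_match(image_bytes: bytes) -> None:
    """Hypothetical reporting hook for a positive hash match."""


def notify_parent() -> None:
    """Hypothetical parental-notification hook for minor accounts."""


def scan_before_icloud_upload(image_bytes: bytes) -> None:
    """Runs as a photo is queued for upload to iCloud Photos."""
    if matches_known_csam_hash(image_bytes):
        record_match(image_bytes)


def check_message_attachment(image_bytes: bytes, user_age: int) -> None:
    """Warns parents of under-13 accounts about sexually explicit images."""
    if user_age < 13 and classifier_flags_explicit(image_bytes):
        notify_parent()
```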