CSAM Detection | Breaking Cybersecurity News | The Hacker News

New Report: Child Sexual Abuse Content and Online Risks to Children on the Rise

Oct 10, 2023 Cybersecurity / Online Security
Certain online risks to children are on the rise, according to a recent report from Thorn, a technology nonprofit whose mission is to build technology to defend children from sexual abuse. Research shared in the Emerging Online Trends in Child Sexual Abuse 2023 report indicates that minors are increasingly taking and sharing sexual images of themselves. This activity may occur consensually or coercively, as youth also report an increase in risky online interactions with adults. "In our digitally connected world, child sexual abuse material is easily and increasingly shared on the platforms we use in our daily lives," said John Starr, VP of Strategic Impact at Thorn. "Harmful interactions between youth and adults are not isolated to the dark corners of the web. As fast as the digital community builds innovative platforms, predators are co-opting these spaces to exploit children and share this egregious content." These trends and others shared in the Emerging Online Trends report…
E.U. Proposes New Rules for Tech Companies to Combat Online Child Sexual Abuse

May 12, 2022
The European Commission on Wednesday proposed new regulation that would require tech companies to scan for child sexual abuse material (CSAM) and grooming behavior, raising worries that it could undermine end-to-end encryption (E2EE). To that end, online service providers, including hosting services and communication apps, are expected to proactively scan their platforms for CSAM as well as report, remove, and disable access to such illicit content. While instant messaging services like WhatsApp already rely on hashed versions of known CSAM to automatically block new uploads of images or videos matching them, the new plan requires such platforms to identify and flag new instances of CSAM. "Detection technologies must only be used for the purpose of detecting child sexual abuse," the regulator said. "Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate…"
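The hash-matching approach mentioned above can be sketched generically: hash each upload and check it against a database of known-bad hashes. This is a minimal illustration only — production systems use perceptual hashes (such as PhotoDNA) that survive resizing and re-encoding, not the plain cryptographic hash used here, and the blocklist entry below is a placeholder, not real data.

```python
import hashlib

# Hypothetical blocklist of known-bad content hashes. Real deployments
# use perceptual hashes (e.g., PhotoDNA) so that re-encoded or resized
# copies of a known image still match; SHA-256 is used here only to
# illustrate the lookup flow. This entry is the SHA-256 of b"test".
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def hash_upload(data: bytes) -> str:
    """Return the hex digest used as the blocklist lookup key."""
    return hashlib.sha256(data).hexdigest()

def should_block(data: bytes) -> bool:
    """Flag the upload if its hash matches a known entry."""
    return hash_upload(data) in KNOWN_HASHES

print(should_block(b"test"))      # True  (matches the placeholder entry)
print(should_block(b"harmless"))  # False (no match in the blocklist)
```

The limitation driving the E.U. proposal is visible in this sketch: a hash lookup can only match content that is already in the database, which is why the plan additionally asks platforms to detect previously unseen CSAM.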
SaaS Compliance through the NIST Cybersecurity Framework

Feb 20, 2024 Cybersecurity Framework / SaaS Security
The US National Institute of Standards and Technology (NIST) cybersecurity framework is one of the world's most important guidelines for securing networks. It can be applied to any number of applications, including SaaS. One of the challenges facing those tasked with securing SaaS applications is the different settings found in each application, which makes it difficult to develop a configuration policy that applies to an HR app that manages employees, a marketing app that manages content, and an R&D app that manages software versions, all while aligning with NIST compliance standards. However, there are several settings that can be applied to nearly every app in the SaaS stack. In this article, we'll explore some universal configurations, explain why they are important, and guide you in setting them in a way that improves your SaaS apps' security posture. Start with Admins: Role-based access control (RBAC) is key to NIST adherence and should be applied to every SaaS app…
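The RBAC idea referenced above can be sketched in a few lines: each role maps to the set of actions it may perform, and every request is checked against that map. The role and action names below are hypothetical examples, not taken from any particular SaaS product.

```python
from enum import Enum

class Role(Enum):
    ADMIN = "admin"
    EDITOR = "editor"
    VIEWER = "viewer"

# Hypothetical permission map: which roles may perform which actions.
# In a real SaaS app this would come from the platform's admin settings.
PERMISSIONS = {
    "read": {Role.ADMIN, Role.EDITOR, Role.VIEWER},
    "write": {Role.ADMIN, Role.EDITOR},
    "manage_users": {Role.ADMIN},
}

def is_allowed(role: Role, action: str) -> bool:
    """Return True only if the role is explicitly granted the action."""
    return role in PERMISSIONS.get(action, set())

print(is_allowed(Role.EDITOR, "write"))         # True
print(is_allowed(Role.VIEWER, "manage_users"))  # False
```

The deny-by-default lookup (`PERMISSIONS.get(action, set())`) reflects the least-privilege principle the NIST framework encourages: an action not explicitly granted to a role is refused.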
Apple Delays Plans to Scan Devices for Child Abuse Images After Privacy Backlash

Sep 04, 2021
Apple is temporarily hitting the pause button on its controversial plans to screen users' devices for child sexual abuse material (CSAM) after receiving sustained blowback over worries that the tool could be weaponized for mass surveillance and erode the privacy of users. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the iPhone maker said in a statement on its website. The announcement, however, doesn't make clear what kind of input it would be gathering, the nature of the changes it aims to devise, or how it intends to implement the system in a way that mitigates the privacy and security concerns that could arise once it's deployed. The changes were originally slated to go live with iOS 15 and macOS Monterey later this year, starting with the U.S…
Apple to Scan Every Device for Child Abuse Content — But Experts Fear for Privacy

Aug 06, 2021
Apple on Thursday said it's introducing new child safety features in iOS, iPadOS, watchOS, and macOS as part of its efforts to limit the spread of Child Sexual Abuse Material (CSAM) in the U.S. To that effect, the iPhone maker said it intends to begin client-side scanning of images shared via every Apple device for known child abuse content as they are being uploaded to iCloud Photos, in addition to leveraging on-device machine learning to vet all iMessage images sent or received by minor accounts (aged under 13) to warn parents of sexually explicit photos shared over the messaging platform. Furthermore, Apple also plans to update Siri and Search to stage an intervention when users try to perform searches for CSAM-related topics, alerting them that the "interest in this topic is harmful and problematic." "Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit," Apple noted…