CSAM Detection | Breaking Cybersecurity News | The Hacker News

New Report: Child Sexual Abuse Content and Online Risks to Children on the Rise

Oct 10, 2023 Cybersecurity / Online Security
Certain online risks to children are on the rise, according to a recent report from Thorn, a technology nonprofit whose mission is to build technology to defend children from sexual abuse. Research shared in the Emerging Online Trends in Child Sexual Abuse 2023 report indicates that minors are increasingly taking and sharing sexual images of themselves. This activity may occur consensually or coercively, as youth also report an increase in risky online interactions with adults. "In our digitally connected world, child sexual abuse material is easily and increasingly shared on the platforms we use in our daily lives," said John Starr, VP of Strategic Impact at Thorn. "Harmful interactions between youth and adults are not isolated to the dark corners of the web. As fast as the digital community builds innovative platforms, predators are co-opting these spaces to exploit children and share this egregious content." These trends and others shared in the Emerging O
E.U. Proposes New Rules for Tech Companies to Combat Online Child Sexual Abuse

May 12, 2022
The European Commission on Wednesday proposed new regulation that would require tech companies to scan for child sexual abuse material (CSAM) and grooming behavior, raising worries that it could undermine end-to-end encryption (E2EE). To that end, online service providers, including hosting services and communication apps, are expected to proactively scan their platforms for CSAM as well as report, remove and disable access to such illicit content. While instant messaging services like WhatsApp already rely on hashed versions of known CSAM to automatically block new uploads of images or videos matching them, the new plan requires such platforms to identify and flag new instances of CSAM. "Detection technologies must only be used for the purpose of detecting child sexual abuse," the regulator said. "Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rat
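Hash-based detection of this kind works by fingerprinting each upload and comparing that fingerprint against a database of hashes derived from previously verified abuse material. The sketch below is a minimal illustration of that lookup, assuming an ordinary SHA-256 digest and a placeholder hash set; production systems such as PhotoDNA rely on perceptual hashes that survive resizing and re-encoding, which a cryptographic digest does not.

```python
import hashlib
from pathlib import Path

# Placeholder set of digests of known, previously verified material, as a
# clearinghouse would supply. The value below is illustrative, not real data.
KNOWN_HASHES: set[str] = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_digest(path: Path) -> str:
    """Compute a SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def should_block_upload(path: Path) -> bool:
    """Block the upload if its digest matches a hash of known material."""
    return file_digest(path) in KNOWN_HASHES
```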
How to Increase Engagement with Your Cybersecurity Clients Through vCISO Reporting

Jul 22, 2024 vCISO / Business Security
As a vCISO, you are responsible for your client's cybersecurity strategy and risk governance. This incorporates multiple disciplines, from research to execution to reporting. Recently, we published a comprehensive playbook for vCISOs, "Your First 100 Days as a vCISO – 5 Steps to Success," which covers all the phases entailed in launching a successful vCISO engagement, along with recommended actions to take and step-by-step examples. Following the success of the playbook and the requests that have come in from the MSP/MSSP community, we decided to drill down into specific parts of vCISO reporting and provide more color and examples. In this article, we focus on how to create compelling narratives within a report, which has a significant impact on the overall MSP/MSSP value proposition. This article brings the highlights of a recent guided workshop we held, covering what makes a successful report and how it can be used to enhance engagement with your cybersecurity clients.
Apple Delays Plans to Scan Devices for Child Abuse Images After Privacy Backlash

Sep 04, 2021
Apple is temporarily hitting the pause button on its controversial plans to screen users' devices for child sexual abuse material (CSAM) after receiving sustained blowback over worries that the tool could be weaponized for mass surveillance and erode the privacy of users. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the iPhone maker said in a statement on its website. The announcement, however, doesn't make clear what kind of input the company would gather, what changes it aims to make, or how it intends to implement the system in a way that mitigates the privacy and security concerns that could arise once it's deployed. The changes were originally slated to go live with iOS 15 and macOS Monterey later this year, starting with the U.S. In
Apple to Scan Every Device for Child Abuse Content — But Experts Fear for Privacy

Aug 06, 2021
Apple on Thursday said it's introducing new child safety features in iOS, iPadOS, watchOS, and macOS as part of its efforts to limit the spread of Child Sexual Abuse Material (CSAM) in the U.S. To that effect, the iPhone maker said it intends to begin client-side scanning of images shared via every Apple device for known child abuse content as they are being uploaded into iCloud Photos, in addition to leveraging on-device machine learning to vet all iMessage images sent or received by minor accounts (aged under 13) to warn parents of sexually explicit photos shared over the messaging platform. Furthermore, Apple also plans to update Siri and Search to stage an intervention when users try to perform searches for CSAM-related topics, alerting that the "interest in this topic is harmful and problematic." "Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit," Apple noted. "The feature is desi
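The client-side approach described here matches images against known-content hashes on the device before they reach iCloud Photos, and escalates only once enough matches accumulate. The sketch below is a heavily simplified, assumed illustration of that threshold idea: it substitutes an ordinary SHA-256 digest for Apple's perceptual NeuralHash and omits the private set intersection and threshold secret-sharing machinery of the actual design; the threshold value and all names are placeholders.

```python
import hashlib
from dataclasses import dataclass

MATCH_THRESHOLD = 30  # illustrative only; the real review threshold is a policy choice

@dataclass
class ClientSideScanner:
    """Counts on-device matches against a known-hash set before any escalation."""
    known_hashes: set[str]
    match_count: int = 0

    def scan(self, image_bytes: bytes) -> None:
        """Hash the image locally and record whether it matches known content."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        if digest in self.known_hashes:
            self.match_count += 1

    def needs_review(self) -> bool:
        """Escalate for human review only after enough matches accumulate."""
        return self.match_count >= MATCH_THRESHOLD
```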