
CSAM Detection | Breaking Cybersecurity News | The Hacker News

New Report: Child Sexual Abuse Content and Online Risks to Children on the Rise

Oct 10, 2023 Cybersecurity / Online Security
Certain online risks to children are on the rise, according to a recent report from Thorn, a technology nonprofit whose mission is to build technology to defend children from sexual abuse. Research shared in the Emerging Online Trends in Child Sexual Abuse 2023 report indicates that minors are increasingly taking and sharing sexual images of themselves. This activity may occur consensually or coercively, as youth also report an increase in risky online interactions with adults. "In our digitally connected world, child sexual abuse material is easily and increasingly shared on the platforms we use in our daily lives," said John Starr, VP of Strategic Impact at Thorn. "Harmful interactions between youth and adults are not isolated to the dark corners of the web. As fast as the digital community builds innovative platforms, predators are co-opting these spaces to exploit children and share this egregious content." These trends and others shared in the Emerging Online Trends in Child Sexual Abuse 2023 report...
E.U. Proposes New Rules for Tech Companies to Combat Online Child Sexual Abuse

May 12, 2022
The European Commission on Wednesday proposed new regulation that would require tech companies to scan for child sexual abuse material (CSAM) and grooming behavior, raising worries that it could undermine end-to-end encryption (E2EE). To that end, online service providers, including hosting services and communication apps, are expected to proactively scan their platforms for CSAM as well as report, remove, and disable access to such illicit content. While instant messaging services like WhatsApp already rely on hashed versions of known CSAM to automatically block new uploads of images or videos matching them, the new plan requires such platforms to identify and flag new instances of CSAM. "Detection technologies must only be used for the purpose of detecting child sexual abuse," the regulator said. "Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate...
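As a rough illustration of the hash-list matching described above, the minimal Python sketch below checks an upload against a set of known-content hashes. It is a simplified assumption-laden example: production systems use perceptual hashes (such as PhotoDNA) supplied by clearinghouses rather than exact digests, and the hash list, function names, and placeholder entry here are hypothetical.

import hashlib

# Hypothetical list of digests of known illegal images, as would be supplied by a clearinghouse.
KNOWN_HASHES = {
    "placeholder_hex_digest_of_a_known_image",
}

def hash_upload(data: bytes) -> str:
    """Return the SHA-256 hex digest of the uploaded bytes (exact-match hashing for illustration)."""
    return hashlib.sha256(data).hexdigest()

def should_block(data: bytes) -> bool:
    """Block the upload if its digest appears in the known-content list."""
    return hash_upload(data) in KNOWN_HASHES

In practice, perceptual hashing is used instead of exact digests so that re-encoded or lightly edited copies of known material still match, which is also why extending such systems to detect previously unseen content is a much harder problem.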
AI Copilot: Launching Innovation Rockets, But Beware of the Darkness Ahead

Apr 15, 2024 Secure Coding / Artificial Intelligence
Imagine a world where the software that powers your favorite apps, secures your online transactions, and keeps your digital life safe could be outsmarted and taken over by a cleverly disguised piece of code. This isn't a plot from the latest cyber-thriller; it's actually been a reality for years now. How this will change – in a positive or negative direction – as artificial intelligence (AI) takes on a larger role in software development is one of the big uncertainties related to this brave new world. In an era where AI promises to revolutionize how we live and work, the conversation about its security implications cannot be sidelined. As we increasingly rely on AI for tasks ranging from the mundane to the mission-critical, the question is no longer just, "Can AI boost cybersecurity?" (sure!), but also "Can AI be hacked?" (yes!), "Can one use AI to hack?" (of course!), and "Will AI produce secure software?" (well…). This thought leadership article is about the latter. Cydrill (a...
Apple Delays Plans to Scan Devices for Child Abuse Images After Privacy Backlash

Sep 04, 2021
Apple is temporarily hitting the pause button on its controversial plans to screen users' devices for child sexual abuse material (CSAM) after receiving sustained blowback over worries that the tool could be weaponized for mass surveillance and erode the privacy of users. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the iPhone maker said in a statement on its website. The announcement, however, doesn't make clear what kind of input it would be gathering, the nature of the changes it aims to devise, or how it intends to implement the system in a way that mitigates the privacy and security concerns that could arise once it's deployed. The changes were originally slated to go live with iOS 15 and macOS Monterey later this year, starting with the U.S.
Apple to Scan Every Device for Child Abuse Content — But Experts Fear for Privacy

Aug 06, 2021
Apple on Thursday said it's introducing new child safety features in iOS, iPadOS, watchOS, and macOS as part of its efforts to limit the spread of Child Sexual Abuse Material (CSAM) in the U.S. To that effect, the iPhone maker said it intends to begin client-side scanning of images shared via every Apple device for known child abuse content as they are being uploaded into iCloud Photos, in addition to leveraging on-device machine learning to vet all iMessage images sent or received by minor accounts (aged under 13) to warn parents of sexually explicit photos shared over the messaging platform. Furthermore, Apple also plans to update Siri and Search to stage an intervention when users try to perform searches for CSAM-related topics, alerting them that "interest in this topic is harmful and problematic." "Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit," Apple noted. "The feature is designed...
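For readers curious how threshold-based flagging of this kind works in principle, here is a minimal, hypothetical Python sketch. The class, the matching flow, and the threshold value are illustrative assumptions, not Apple's actual implementation: the idea is simply that no single match surfaces an account, only an accumulation of matches against the known-content list does.

from dataclasses import dataclass, field

MATCH_THRESHOLD = 30  # illustrative value only; chosen here purely for the sketch

@dataclass
class AccountMatchState:
    # IDs of uploaded images whose hashes matched the known-content list.
    matched_image_ids: set = field(default_factory=set)

    def record_match(self, image_id: str) -> None:
        """Record that one uploaded image matched the known-content hash list."""
        self.matched_image_ids.add(image_id)

    def exceeds_threshold(self) -> bool:
        """Only once enough distinct matches accumulate would the account be surfaced for human review."""
        return len(self.matched_image_ids) >= MATCH_THRESHOLD

The rationale for a threshold is to keep individual matches opaque and ensure that an occasional false positive on its own never exposes an account for review.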