October 10, 2023 at 07:00AM – New Report: Child Sexual Abuse Content and Online Risks to Children on the Rise


A recent report from Thorn, a technology nonprofit focused on defending children from sexual abuse, finds that minors are increasingly sharing sexual images of themselves online, sometimes consensually and sometimes under coercion. The findings align with those of other child safety organizations, which report a significant increase in child sexual abuse material (CSAM). Technological tools such as hashing and matching are crucial for detecting and removing CSAM from platforms that host user-generated content, and collaboration between tech companies and NGOs is essential to combating the problem effectively.

The Thorn report details an alarming increase in online risks to children. Minors are increasingly taking and sharing sexual images of themselves, sometimes consensually and sometimes under coercion, and risky online interactions between minors and adults are also on the rise.

John Starr, VP of Strategic Impact at Thorn, emphasized that CSAM is easily shared on the platforms people use every day, and that predators are exploiting these spaces to distribute harmful content. This aligns with the findings of other child safety organizations, such as the National Center for Missing and Exploited Children (NCMEC), which has seen a 329% increase in reported CSAM files over the last five years.

Several factors contribute to the rise in reports:

1. More platforms are implementing tools like Thorn’s Safer product to detect known CSAM using hashing and matching.
2. Online predators are becoming more brazen and using novel technologies like chatbots to scale their enticement, resulting in an 82% increase in reports of online enticement of children for sexual acts.
3. Self-generated CSAM (SG-CSAM) is also increasing, with a 9% rise from 2021 to 2022 according to the Internet Watch Foundation.

Any platform that hosts user-generated content, from profile pictures to cloud storage, is at risk of hosting CSAM. Hashing and matching technology plays a crucial role in detecting and disrupting its spread. Cryptographic hashing converts a file into a fixed-length value that acts as a digital fingerprint of its exact bytes, while perceptual hashing produces similar values for visually similar images, so a match survives resizing or re-encoding.
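
To make the distinction concrete, here is a minimal Python sketch. It assumes the third-party Pillow and ImageHash libraries for the perceptual side; the file names and the distance threshold are illustrative placeholders, and none of this represents Thorn's Safer implementation.

```python
import hashlib

import imagehash       # third-party: pip install ImageHash
from PIL import Image  # third-party: pip install Pillow


def cryptographic_hash(path: str) -> str:
    """Exact-bytes fingerprint: changes entirely if even one byte differs."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def perceptual_hash(path: str) -> imagehash.ImageHash:
    """Visual fingerprint: stays close for resized or re-encoded copies."""
    return imagehash.phash(Image.open(path))


# Placeholder file names for illustration only.
original = perceptual_hash("photo.jpg")
altered = perceptual_hash("photo_resized.jpg")

# Perceptual hashes are compared by Hamming distance rather than equality;
# the threshold of 8 bits is an illustrative choice, not a standard value.
if original - altered <= 8:
    print("visually similar: likely the same underlying image")
```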

To detect CSAM, content is hashed, and the resulting hash values are compared against hash lists of known CSAM. Thorn’s Safer tool aggregates a database of over 29 million known CSAM hash values, allowing tech companies to block or remove this illicit content from their platforms. Sharing named or anonymous hash lists between companies further expands the database, helping disrupt the viral spread of CSAM.
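
In practice, matching comes down to a set-membership check of a content hash against a vetted hash list. The sketch below is a generic illustration under that assumption; the hash list contents are synthetic placeholders, and the function names are hypothetical rather than Safer's actual API.

```python
import hashlib

# Synthetic placeholder standing in for a vetted hash list such as the
# 29-million-entry database Safer aggregates; a real deployment would load
# these values from a trusted source, never hard-code them.
KNOWN_CSAM_HASHES: set[str] = {
    hashlib.sha256(b"placeholder-entry").hexdigest(),
}


def is_known_match(file_bytes: bytes, known_hashes: set[str]) -> bool:
    """Hash the uploaded bytes and test the digest against the known list."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes


def handle_upload(file_bytes: bytes) -> None:
    if is_known_match(file_bytes, KNOWN_CSAM_HASHES):
        # A match means the content is already-verified CSAM: block it and
        # route it to the platform's review and reporting workflow.
        print("match: content blocked and flagged for review")
    else:
        print("no match: content continues through normal moderation")
```

Sharing hash lists between companies, as described above, simply grows the set of known hashes that every participating platform checks against, which is what makes cross-industry collaboration so effective.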

Both tech companies and NGOs have important roles to play in eliminating CSAM from the internet. Content-hosting platforms must become key partners in combating child sexual abuse, aided by tools like Thorn's Safer. In 2022, Safer hashed more than 42.1 billion images and videos, identifying 520,000 known CSAM files. To date, Safer has helped its customers identify over two million pieces of CSAM on their platforms.

By adopting CSAM detection tools, more platforms can help reverse the alarming rise of child sexual abuse material online.

Full Article – https://ift.tt/G9lRUWy