Addressing the distribution of illicit sexual content by minors online

A Stanford Internet Observatory investigation identified large networks of accounts, purportedly operated by minors, selling self-generated illicit sexual content. Platforms have updated safety measures based on the findings, but more work is needed.

The creation and trading of Child Sexual Abuse Material, or CSAM, is often regarded as the most harmful abuse found across online communication and social media platforms. 

Most of the policy, law enforcement and platform discussion on addressing CSAM rightfully focuses on adult offenders who create, distribute and monetize sexual imagery of children. While a majority of the content that is purchased or traded online is created by adult abusers, in some instances minors (often teenagers) also create this illegal content. These minors often provide paid offerings modeled after content on well-known adult sites such as OnlyFans. In other cases, minors are coerced into producing illicit sexual content, a crime known as sextortion; the FBI has reported a dramatic increase in such cases in the past year.

A new Stanford Internet Observatory report investigates networks on Instagram and Twitter that are involved in advertising and trading self-generated child sexual abuse material (SG-CSAM). These findings were covered by the Wall Street Journal, with responses from the companies named in the report.

Key Takeaways

  • Large networks of accounts that appear to be operated by minors are openly advertising self-generated child sexual abuse material (SG-CSAM) for sale.
  • Instagram is currently the most important platform for these networks, with features like recommendation algorithms and direct messaging that help connect buyers and sellers.
  • Twitter had an apparent and now resolved regression allowing CSAM to be posted to public profiles despite hashes of these images being available to platforms and researchers for automated detection and removal.
  • Telegram implicitly allows the trading of CSAM in private channels.
  • Gift card swapping and exchanges, such as G2G, are a critical part of the monetization of SG-CSAM, allowing anonymous compensation for content.
  • Study of these dynamics is challenging but necessary, particularly in an environment where platform providers are divesting from Trust and Safety programs. SIO has implemented systems to study these networks while preventing exposure to or storage of CSAM itself.


Our investigation finds that large networks of accounts, purportedly operated by minors, are openly advertising SG-CSAM for sale on social media. Instagram has emerged as the primary platform for such networks, providing features that facilitate connections between buyers and sellers. Instagram's popularity and user-friendly interface make it a preferred option for these activities. The platform's recommendation algorithms effectively advertise SG-CSAM: these algorithms analyze user behaviors and content consumption to suggest related content and accounts to follow. 

These networks are also present on Twitter. In the course of the investigation, researchers found that, despite the availability of image hashes to identify and remove known CSAM, Twitter experienced an apparent regression in its mitigation of the problem. Scanning with PhotoDNA, a widely used system for detecting known CSAM, surfaced matches on public profiles, meaning the safeguards that should have prevented the spread of such content were not working. This gap was disclosed to Twitter's Trust & Safety team, which responded to address the issue. The failure nonetheless highlights the need for platforms to prioritize user safety, and the importance of collaborative research efforts to mitigate and proactively counter online child abuse and exploitation.
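For readers unfamiliar with how hash-based detection works in practice, the sketch below illustrates the general idea: images uploaded to a platform are hashed and compared against a list of hashes of previously identified CSAM, and any match is flagged for removal and reporting. PhotoDNA itself is a proprietary perceptual-hashing system that tolerates small image alterations; the exact-match lookup, the load_known_hashes helper, and the file name used here are hypothetical simplifications for illustration, not the tooling SIO or the platforms actually use.

```python
# Minimal sketch of hash-list matching for content moderation.
# NOTE: PhotoDNA is proprietary and uses fuzzy perceptual matching;
# the exact-match set lookup, the load_known_hashes() helper, and the
# "known_hashes.txt" file name below are hypothetical simplifications.

from typing import Iterable, List, Set


def load_known_hashes(path: str) -> Set[str]:
    """Load a newline-delimited list of known-bad image hashes."""
    with open(path, "r", encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}


def find_matches(upload_hashes: Iterable[str], known_hashes: Set[str]) -> List[str]:
    """Return the uploaded-image hashes that appear in the known-bad list."""
    return [h for h in upload_hashes if h in known_hashes]


if __name__ == "__main__":
    known = load_known_hashes("known_hashes.txt")   # hash list shared with trusted partners
    uploads = ["3f2a...", "9c1d..."]                # hashes computed from new uploads
    for h in find_matches(uploads, known):
        print(f"Hash {h} matches known CSAM -- flag for removal and reporting")
```

In production systems the matching runs at upload time, the hash lists are maintained by organizations such as NCMEC and shared with vetted platforms and researchers, and matches trigger removal and mandatory reporting rather than a simple print statement.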

While the primary platforms identified as having significant SG-CSAM activity were Instagram and Twitter, this ecosystem leverages a wide cross-section of the industry, including services we could not analyze in depth using open-source methods.

An industry-wide initiative is needed to limit production, discovery, advertisement and distribution of SG-CSAM; more resources should be devoted to proactively identifying and stopping abuse. These networks utilize not only social media platforms, but file sharing services, merchants, and payment providers. Given the multi-platform nature of the problem, addressing it will require better information sharing about production networks, countermeasures, and methods for identifying buyers.

SIO hopes that this research aids industry and non-profits in their efforts to remove child sexual abuse material from the internet. Platforms have updated safety measures based on the findings, but more work is needed. We will continue to partner with technology and child safety organizations to conduct further research and recommend countermeasures.
