CSAM Perpetrator Research Report: Findings from a Survey of CSAM Perpetrators on Digital Platform Use and Design


Published: 18 March 2026




Download Data Annex


Cite this report: Protect Children. (2026). CSAM Perpetrator Research Report: Findings from a Survey of CSAM Perpetrators on Digital Platform Use and Design (Tell Me More About Tech). https://www.protectchildren.fi/en/post/tmat-csam-perpetrator-research-report


Protect Children and Ofcom have conducted global research on online child sexual abuse and exploitation


Each year, millions of images and videos depicting child sexual abuse circulate online, causing profound and long-lasting harm to victims and survivors. Despite efforts to tackle the threat, the scale and accessibility of child sexual abuse material (CSAM) online continue to pose a major global challenge.


Tackling these appalling online crimes requires up-to-date evidence on how perpetrators operate across digital environments.


This report presents findings from a large-scale anonymous self-report survey of CSAM perpetrators. The research provides rare insight into perpetrator behaviour across the digital ecosystem, including the platforms, technologies, and online environments used to search for, view, and share CSAM.


About the research


The study is based on a global anonymous self-report survey conducted among individuals searching for CSAM through the dark web search engine Ahmia.fi.

The survey appeared when users searched for terms related to CSAM. Instead of displaying illegal search results, the intervention redirected users to the survey and to prevention resources.


Key facts about the study:

  • Over 20,000 responses analysed globally

  • Survey available in 24 languages

  • Participants recruited while actively searching for CSAM online

  • Ethical approval granted by the Ethics Committee of the Tampere Region


In addition to generating research data, the intervention also encouraged participants to reflect on their behaviour and provided links to help-seeking resources.


As the research was conducted in a global and borderless online context, it did not assess the effectiveness of any single national legal or regulatory online safety regime.


Quick findings: 


  • Early exposure to pornography and CSAM is a major risk factor. By age 18, 65% of respondents had seen pornography, and 59% had seen CSAM. Many reported that their first encounter with CSAM was accidental. 

  • Perpetrators access CSAM across multiple online environments, using both the dark web and popular open web platforms. A third (33%) felt CSAM has become harder to access, particularly in the past five years. 44% perceived no change, and 23% believed it has become easier. 

  • Generative AI is reshaping and exacerbating CSAM perpetration. 35% reported viewing or creating AI-CSAM. Half of these respondents were involved in commissioning or producing AI-CSAM for profit. 

  • Well-designed deterrence messages can reduce engagement with CSAM. 34% of respondents recalled encountering a warning message when searching for CSAM, and many reported that it prompted them to reflect on or change their behaviour.


Why this research matters


Together, the findings of this study provide evidence of the mechanisms enabling online child sexual abuse and exploitation and identify tangible opportunities to reduce risk.


Urgent action is required to strengthen safeguards, implement safety-by-design principles, expand effective moderation, and deploy evidence-based digital interventions that prevent abuse before it escalates.





This report was prepared by Protect Children as part of the Tell Me More About Tech project, which is sponsored by Ofcom.

